
Topic: Anyone can test these 6 Pcie slots Ryzen mobos? - page 2. (Read 17999 times)

sr. member
Activity: 512
Merit: 260
I don't think I will ever build a rig with fewer than 12 GPUs again. I'm in love with the Biostar TB250-BTC PRO Ver. 6.x (Intel LGA1151); it works so well compared with my other mobos. I have 10 rigs: 2 have the TB250 and the rest are a mix of 97 and 270 boards. AMD really needs to make up some ground here.
newbie
Activity: 2
Merit: 0
Hey people, I'm posting this because I finally got 6 GPUs working on a Ryzen mobo under Windows 10, and I believe it can be useful to others who, like me, were struggling to achieve this.

System: AMD Ryzen 7 1700
MSI X370 SLI Plus with latest mobo firmware
6x ASUS Strix GTX 1070 (non-OC version, but overclocked: mem clock @ 8723 MHz, core @ 1903 MHz)
Corsair 8GB 3200 MHz RAM
2x Seasonic X-850 (850W, 80+ Gold) PSUs, 3 GPUs on each; the second PSU is powered on with the metal-clip method.
Windows 10 Pro updated to latest everything

A little history first:
I tried every combination of slots and risers; they all work independently, and up to 5 GPUs work using the latest Nvidia driver (384.94), with Windows 10 booting up and hashing Zcash with EWBF without problems. But no matter where I put the 6th GPU, Windows would not boot and would enter a loop until the automatic repair screen eventually appeared.
If I booted into safe mode, Windows showed all 6 GPUs in Device Manager, but the driver wouldn't load, so I couldn't mine. After a DDU driver uninstall, booting into normal mode worked and Windows showed all 6 GPUs in Device Manager, but as generic display adapters, so of course mining was impossible.
So I bought ethOS and, after the initial learning curve, got all the GPUs mining, but I constantly had random crashes. They didn't lock up the GPUs, but they were very annoying because I lost hours of possible mining (which translates directly into lost revenue). Because I didn't want to open a port on my router to access the rig remotely, I used TeamViewer on a second machine that mines on Windows at a much lower rate (2x GTX 980) but is super f***ing stable (not a single crash ever, even overclocked, versus the ethOS rig's 2 crashes per day even at stock settings); that machine let me reach the Linux rig over the local network.
Still, I wanted to mine on Windows so that I could 1) have rock-solid stable mining and 2) use TeamViewer to manage the rig as if I were sitting right in front of it.
Checking the main panel in ethOS, I noticed the GPUs were listed as 21-22-23-24-26-27, with "25" missing, so I thought the order in which the PCIe slots are populated might have something to do with it.
On the mobo, the slots go from top to bottom: x1 - x16 - M.2 - x1 - x16 - x1 - x8
So I bought an M.2-to-PCIe x4 adapter to add a 7th GPU, and for some reason I thought, "why not try it on Windows?" So I set it up as follows: x1 / x16 / M.2 / x1 / x16 / x8, and turned it on...
And that motherf.....g Windows just booted right up, loaded the OC config with GPU Tweak II, and started mining like a devil at 2830-2850 Sol/s. Freaking beautiful.
So, 6 GPUs on X370 on Windows 10 is absolutely doable.
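As a rough sanity check on the rate above: 2830-2850 Sol/s split across six cards works out to roughly 472-475 Sol/s per GPU, plausible for an overclocked GTX 1070 on Equihash (the even split across cards is an assumption):

```python
# Reported whole-rig Equihash rate and card count from the post above.
total_low, total_high = 2830, 2850  # Sol/s for the whole rig
num_gpus = 6

# Assume the work is split evenly across the six cards.
per_gpu_low = total_low / num_gpus    # ~471.7 Sol/s
per_gpu_high = total_high / num_gpus  # 475.0 Sol/s
print(f"Per-GPU rate: {per_gpu_low:.0f}-{per_gpu_high:.0f} Sol/s")
```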
I hope this helps others.

I can submit pics if needed

Later.

Edited: cleaning text, spelling, typos and stuff...

Did you enable 4G decoding?
full member
Activity: 305
Merit: 148
Theranos Coin - IoT + micro-blood arrays = Moon!
Haha, after 3 days of struggling I finally achieved the impossible: 6x 1080 Ti on an AM4 motherboard, mining guaranteed, and that's not counting the 2x M.2 slots. I could reach 8 GPUs once I have enough parts; the rest are on the way.

Spec: Ryzen 7 1700, ASRock X370 Killer SLI/ac (2x PCIe x16 + 4x PCIe x1 + 2 SATA/PCIe NGFF M.2). Windows 10
Risers: 6x Ubit risers, latest version with LED. One arrived defective; I'm contacting them for a replacement.



The MSI Gaming Pro / Gaming Pro Carbon will do 5/6 cards without many issues. Update to the Fall Creators Update, and have DDU handy in case something goes wrong the first time. I found it pretty easy to get 6 Nvidia cards working on these AM4 motherboards. It was trickier with the AMD Vega 56s, but the Fall Creators Update and a few other tweaks solved that as well. The nice thing about these Ryzen builds is that you can have several CPU threads working away on Monero while the GPUs mine ZEC or something else. That's easier to pull off on Ryzen motherboards than on i3/i5 Intel builds.
newbie
Activity: 12
Merit: 0
Haha, after 3 days of struggling I finally achieved the impossible: 6x 1080 Ti on an AM4 motherboard, mining guaranteed, and that's not counting the 2x M.2 slots. I could reach 8 GPUs once I have enough parts; the rest are on the way.

Spec: Ryzen 7 1700, ASRock X370 Killer SLI/ac (2x PCIe x16 + 4x PCIe x1 + 2 SATA/PCIe NGFF M.2). Windows 10
Risers: 6x Ubit risers, latest version with LED. One arrived defective; I'm contacting them for a replacement.

So with only 5 risers on hand, the only possible way I could think of was that one GPU had to be connected directly into a PCIe x16 slot on the motherboard, and that's exactly what I did. Here is how:

1. Download this portable freeware (Display Driver Uninstaller) and have it ready: http://www.guru3d.com/files-details/display-driver-uninstaller-download.html
2. Download the latest driver for your GPUs and have it ready.
3. Download and flash the latest BIOS for your mobo (mine is version 4.50, 2018/1/16).
4. Go into your UEFI/BIOS, find and enable "IOMMU" (the equivalent of "4G decoding"), and also change 2 PCIe (or PEG) options to "Gen1" (I forget the exact setting names, but they're there).
5. Always reboot or start your system in SAFE MODE while installing GPUs (Windows key > msconfig > Boot tab > Boot options: Safe boot). Repeat this every time you reboot your computer for the best chance of success.
6. Boot into safe mode, run DisplayDriverUninstaller.exe, and choose Clean and Shutdown.
7. Install your GPUs, all at once or one by one (like I did), whichever works for you. My setup is 5 GPUs via risers and one in the second PCIe x16 slot.
8. Patiently* boot into safe mode. Check Device Manager to see if all 6 GPUs show up. If they appear, go ahead and install the driver, reboot into normal mode, and enjoy. If not, repeat carefully. As long as I rebooted into safe mode, all my steps were solid and worked 100%.

*After installing new GPU(s), the main display should be the GPU in the first PCIe slot (PCIe slot 1). Either that, or try plugging your display cable (HDMI/DP/VGA...) into the card in PCIe slot 2. Worst case, try all the other GPUs until Windows shows up on your monitor.
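For step 8, a quick way to count how many GPUs Windows sees without clicking through Device Manager is to parse the output of `wmic path win32_VideoController get Name`. A small sketch (the sample output below is made up for illustration):

```python
def count_gpus(wmic_output: str) -> int:
    """Count display adapters listed by `wmic path win32_VideoController get Name`."""
    lines = [line.strip() for line in wmic_output.splitlines()]
    # Drop the "Name" header row and blank lines; everything left is an adapter.
    return sum(1 for line in lines if line and line != "Name")

# On the rig itself (Windows) you would capture the real output with:
#   import subprocess
#   out = subprocess.check_output(
#       ["wmic", "path", "win32_VideoController", "get", "Name"], text=True)
# Here a made-up sample shows the parsing:
sample = "Name\r\nNVIDIA GeForce GTX 1080 Ti\r\nNVIDIA GeForce GTX 1080 Ti\r\n\r\n"
print(count_gpus(sample))  # -> 2
```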

Also, I don't recommend any MSI boards for this unless they are made for mining. Look at their motherboard specifications: they hide as much information as possible, and the missing details can only be found in the specific manual for each board. What they hide is that some SATA and PCIe lanes get disabled when devices are connected to other lanes, which pretty much explains why MSI owners have no luck stacking GPUs. The best brands to experiment with are Asus and, as I just found out, ASRock: no PCIe lanes are disabled no matter how many cards you plug in, so they have a better chance of working. Asus and ASRock boards also mostly come with 6 or more PCIe slots and 2 M.2 slots, which is a plus for both brands.
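The lane-sharing issue described above can be made concrete with a small sketch. The sharing table here is invented for illustration only; the real one lives in your board's manual ("PCI_E4 is unavailable when M2_1 is in use", and so on):

```python
# Illustrative only: a made-up lane-sharing table in the style of the ones
# buried in some motherboard manuals. Slot names are hypothetical.
SHARED_WITH = {
    "PCI_E4": "M2_1",  # hypothetical: PCI_E4 shares lanes with M.2 slot M2_1
    "SATA5": "M2_2",   # hypothetical: SATA port 5 shares lanes with M2_2
}

def disabled_slots(populated: set[str]) -> set[str]:
    """Return the slots this (hypothetical) sharing table disables."""
    return {slot for slot, rival in SHARED_WITH.items() if rival in populated}

# Populating M2_1 silently costs you PCI_E4 on a board with this table:
print(disabled_slots({"PCI_E1", "M2_1"}))  # -> {'PCI_E4'}
```

On a board with no such sharing, `disabled_slots` would always return an empty set, which is the behavior the post attributes to the Asus and ASRock boards.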

Well, I think that covers almost everything I did to get 6(+2) GPUs working on an AM4 motherboard. If you think I missed anything or have any questions, please send me a reply. If you find my post helpful, please share it as much as you can. I am an AMD fanboy and I want AMD to be popular not only in gaming but also in mining. I'm looking forward to building my 2nd rig with Zen 2 if possible. Good luck and happy mining.

Edit: Here is proof, guys: https://imgur.com/a/xoTqY
newbie
Activity: 2
Merit: 0
I recommend you get an M.2 card. The Ultra M.2 slot on the Pro4 should work, and I have found these adapters to detect easily without any issues so far. Just make sure you buy one with good reviews. The first one I bought was from BeeEaster on Amazon, and it started to burn up as soon as I turned on my computer; I had not connected the power cable to the M.2 card, which is recommended.
newbie
Activity: 1
Merit: 0
I too am having problems with the AB350 Pro4 mobo. It works perfectly fine with 5 GPUs, but as soon as I use both PCIe x16 slots, the card in the secondary x16 slot is not recognized.

Windows boots every time without any problems, but only 5 GPUs are recognized. I haven't tried an M.2 adapter so far, and I'm thinking about switching to ethOS, but spending 40 bucks on the off chance that it will recognize all 6 GPUs properly doesn't seem like a good idea atm.

If anyone has managed to get 6 GPUs working on that stupid mobo, please enlighten your fellow miners :)
newbie
Activity: 2
Merit: 0
I wanted to share my experience with a few boards. For anyone looking for an AM4 mining board, the ASRock X370 Killer SLI/ac seems to work right out of the box with at least 5 cards. I used 5 of the slots on the board and it booted every time I added one, without a single hiccup. It has two additional M.2 slots that are both wired for PCIe, so I believe it could do 7, as I have had good success with M.2 expansion adapters on another ASRock board.

These are the other ones I have tried:

Asus Prime B350-Plus - Worked with 4 GPUs with no problems and is stable, but it only has 4 PCIe slots, and the M.2 would not detect at all on this board for some reason.

MSI Krait Gaming X370 - Big disappointment. Could not get it to detect more than 2 cards in the PCIe slots. Tried different BIOS versions and settings.

ASRock Fatal1ty AB350 Gaming K4 - Have it running with 4 PCIe + 1 M.2 currently, but to get that 4th card working I have to unplug the USB cable from the 4th card's riser when the Windows loading screen appears. It's a trick I learned here, but I've had to do it every time the system restarts or else it just reboots constantly. I was going to try the second M.2 slot as well, but the manual says it's storage-only, unlike on the Killer X370 board.
newbie
Activity: 2
Merit: 0
UPDATE:
I did succeed in getting a 5th card installed and mining. It was pretty much a "reboot and reinstall drivers until it works" situation, but it might have had something to do with skipping the 4th PCIe slot (x16 #2). I can't think of anything else I did differently.

So to reiterate:
BIOS mod each card, one by one:
1. Install the card in the first (top) x16 slot
2. Use ATIFlash/Polaris to set your straps
3. Benchmark/tweak the straps to your liking
4. Tweak settings in WattMan/OverdriveN/Afterburner or whatever you prefer until you're happy with them
5. Write your preferred settings into the BIOS
6. Reboot/patch/test
7. Once the card is modded and running the way you like, pull it off the motherboard and GOTO 10 for the rest of the cards

Next, install the cards one by one, starting from the top PCIe x1 slot and moving down (skip slot #4 / PCIe x16 #2).
1. Shutdown
2. Install next card
3. Boot up/patch drivers if needed
4. Reboot, confirm card shows up
5. GOTO 10 until you have 5 cards working

About the only thing left to do at that point is to build your .bat files and fire the rig up. If I manage to get a 6th card installed and working, I'll update the thread again.
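Those .bat files are usually just a restart-on-crash loop around the miner. The same idea sketched in Python (the miner command line is a placeholder, not a real path):

```python
import subprocess
import time

MINER_CMD = ["miner.exe", "--server", "pool.example.com"]  # placeholder command

def watchdog(cmd, max_restarts=None, delay=5):
    """Run `cmd` and restart it whenever it exits. Returns the restart count."""
    restarts = 0
    while max_restarts is None or restarts < max_restarts:
        result = subprocess.run(cmd)
        print(f"exited with code {result.returncode}; restarting in {delay}s")
        time.sleep(delay)
        restarts += 1
    return restarts

# On the rig you would call watchdog(MINER_CMD) and leave it running.
```

The short delay before each restart keeps a miner that crashes instantly from spinning the loop at full speed.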

System:
ASRock AB350 Pro 4
Ryzen 5 1600x
Viper 4GB
Kingston UEFI 16GB USB3.0
Corsair HXi 1200W
v0008c risers (Amazon)
Crimson BETA blockchain drivers. UPDATE: also working with the 17.2 Crimson and 17.12 Adrenalin drivers
Windows 10 Home N

After some trouble, I managed to get 4 cards up and running reliably (99.7% uptime since NYE) by skipping slot 4 (PCIe x16 #2). Getting 4 cards up was as simple as BIOS-programming them one by one, then adding them to the system one at a time. It took a few driver patches after adding new cards, but Win10 boots really fast off the USB, so it was only about a two-hour process to benchmark, mod, and install them all.

Once I hit that 5th card, though, I started to have problems booting into Windows.
newbie
Activity: 3
Merit: 0
Ugh, I think my AB350 Pro4 is a lemon. It can't recognize more than 4 GPUs even though I've tried 3 different risers, Gen1 vs Gen2, the latest BIOS, 4G decoding, everything.

What a waste of time
newbie
Activity: 7
Merit: 0
The exact same issue here, mate!

Fucked up :(

It might be a limitation of the motherboard that not even BIOS updates can fix.

I did read that some users on ethOS have been able to run all 6.
newbie
Activity: 2
Merit: 0
So I picked up the ASRock AB350 Pro4 because I wanted to build a new desktop, and what better way to benchmark it than mining?

System:
ASRock AB350 Pro 4
Ryzen 5 1600x
Viper 4GB
Kingston UEFI 16GB USB3.0
Corsair HXi 1200W
v0008c risers (Amazon)
Crimson BETA blockchain drivers
Windows 10 Home N

After some trouble, I managed to get 4 cards up and running reliably (99.7% uptime since NYE) by skipping slot 4 (PCIe x16 #2). So far I haven't figured out the magic mojo needed to get 5 cards online, but getting 4 cards up was as simple as BIOS-programming them one by one, then adding them to the system one at a time. It took a few driver patches after adding new cards, but Win10 boots really fast off the USB, so it was only about a two-hour process to benchmark, mod, and install them all.

Once I hit that 5th card, though, I started having problems booting into Windows. I've tried a few things, but I'm not going to go to extremes. Once I get that lucky NowInStock notification and snag myself an Asus ROG STRIX GTX 1080 Ti, this will be my primary desktop and gaming system, so I don't want to muck about with the BIOS too much.
hero member
Activity: 714
Merit: 512
The AB350 Pro4 works with 6 GPUs; I have 2 of them, each with a Ryzen 7.

I have this board and I am having issues with six GPUs -- any tips?
newbie
Activity: 4
Merit: 0
Confirmed 6x Nvidia GPUs on an ASRock AB350. BIOS 3.20, no BIOS setting tweaks required. Install at least one card directly in the primary x16 slot. Do not install any cards or risers in the secondary x16 slot. Needed an M.2-to-PCIe adapter for the 6th card. Ubuntu 17.04.
Missed this one. Will try an M.2-to-PCIe adapter.

The M.2 -> PCIe adapter didn't help. The card is not detected by Windows when connected to M2_2, and goes crazy when connected to M2_1 (just the same as on PCIE4).

Tried Ubuntu 17.10 and Arch. Kernel panic on boot.

Also, I don't recommend updating the BIOS to P4.40. It doesn't boot even with 5 cards; I didn't try fewer.

It is stable on P3.20 with 5 cards and Win10, though. So I guess I have to leave it at that.
newbie
Activity: 43
Merit: 0
Thanks !
newbie
Activity: 4
Merit: 0
Hi,

Can I add 2 GTX 1070 Tis (for mining) to a Gigabyte GA-AB350-Gaming 3 (in the 2 other PCIe 3.0 x16 ports, without risers) when I already have 1 GTX 1070 Ti in the first PCIe 3.0 x16 port (for gaming)?

Or do I absolutely need risers for those 2 cards?

Thanks in advance


No.

Risers are needed either for a PCIe x16 -> PCIe x1 transition, or when a directly installed card would block another slot.

If you can install 3 cards in 3 slots directly, do so. It's probably even better.
newbie
Activity: 43
Merit: 0
Hi,

Can I add 2 GTX 1070 Tis (for mining) to a Gigabyte GA-AB350-Gaming 3 (in the 2 other PCIe 3.0 x16 ports, without risers) when I already have 1 GTX 1070 Ti in the first PCIe 3.0 x16 port (for gaming)?

Or do I absolutely need risers for those 2 cards?

Thanks in advance

newbie
Activity: 4
Merit: 0
Confirmed 6x Nvidia GPUs on an ASRock AB350. BIOS 3.20, no BIOS setting tweaks required. Install at least one card directly in the primary x16 slot. Do not install any cards or risers in the secondary x16 slot. Needed an M.2-to-PCIe adapter for the 6th card. Ubuntu 17.04.
Missed this one. Will try an M.2-to-PCIe adapter.
newbie
Activity: 4
Merit: 0
I am trying to build a rig of 6 RX 570s on an ASRock B350 Pro4 (with a Ryzen 3 1200).

Ubuntu 16.04.3 doesn't boot when even one card is connected via riser; it fails with a kernel panic. This happens both with the live USB and with a system installed while no riser was connected.

On Windows 10, 5 cards work fine.
But the 6th card (connected to the second PCIe x16 slot via riser) isn't detected properly most of the time (about 4 boots out of 5). And even when it is detected, the PC freezes after the miner has been running for a while, and that's it.
I tried swapping slots, and the problem seems to be the second PCIe x16 slot.
I tried different BIOS versions: 3.20 (stock), 3.30, 4.40, 2.50. Same problem.
On 4.40 the AMD PBS section appeared again (mentioned earlier in the thread); I tried switching to 2x8 mode, and that gave me 5 short beeps and a black screen (no sign of booting) whenever cards were connected via riser.

Notes:
1. The cards have hacked BIOSes. All of them work fine; I tried them on another working rig.
2. One card is connected directly into the first PCIe x16 slot, without a riser. It's just more convenient. It would be a pity if that turns out to be the cause of the problem.

Any ideas how to make the 6th card work stably?
full member
Activity: 306
Merit: 100
Can anyone here help me?

I'm using an ASRock AB350 Pro4.

I have a 1070 Ti in one PCIe 3.0 slot,

and I'm trying to plug a 1060 into the other PCIe 3.0 slot,

but after that Windows can only detect the 1060.

Do I need to use a riser for this to work?

I've tried reinstalling the drivers.
newbie
Activity: 7
Merit: 0
Hi,
I finally upgraded my rig to an X370 Carbon with 4 Vega 56s, and added 2 Vega 64s.
I struggled a bit to get it working. Now every card shows up in Windows, but I only managed to get 5 Vegas mining.
The Vega in the first PCIe x16 port freezes the computer whenever I compute with it, and it's the new Vega 64 I just bought.

What do you think? Is the new Vega 64 broken, or is it the fact that it's running in the PCIe x16 slot?
(I can still return the card for a refund, and I don't have an extra riser to try...)

I bought a new USB riser and my rig is now stable:
X370 Carbon with 4 Vega 56s and 2 Vega 64s

What screwed me up for a couple of nights was using the PCIe x16 slot.
With all cards plugged into USB risers, it works nicely (from driver installation to reg optimization).