
Topic: 19 GPU in one Motherboard - page 3. (Read 6825 times)

full member
Activity: 362
Merit: 102
September 24, 2017, 01:43:28 PM
#36
Just got off the phone with OP. :) There's some sort of hardware limitation at 13 GPUs. I guess P106s work because they have no display outputs, and supposedly AMD cards may also work because they use different drivers.

So yeah, I need either P106 cards or AMD cards. I'm not yet sure how to run both AMD and NVIDIA on nvOS, though. But perhaps I will put in some orders for Vega 64s to mix with my 1080 Tis.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 23, 2017, 09:13:53 PM
#35
Seems no matter what I do, 14th card results in completely not booting at all

You must be doing something wrong, or you must have missed a step; it's one or the other. The Asus B250 Mining Expert should have no problem booting all 16 cards, 8 AMD and 8 NVIDIA, on Windows.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 23, 2017, 09:08:33 PM
#34
I wonder how the Asus B250 Mining Expert compares to the Biostar B250 BTC Pro. I still think the Asus B250 Mining Expert is better than the Biostar in every respect.
hero member
Activity: 1036
Merit: 606
September 23, 2017, 08:13:58 PM
#33
2nd and 3rd server PSUs are powering all GPUs

Does not boot to BIOS. The screen does not register a signal.

It doesn't matter which slots are populated and which are empty; the 14th card just kills it.

Is this starting to look like a bad motherboard, or one that I may have already damaged through an improper initial configuration?

Also, the B250 chipset only supports up to 12 dedicated PCI-E lanes natively. The remaining PCI-E slots go through a PCI-E expansion chip. Nvidia cards are notorious for PCI-E compatibility problems when running through expansion chips, which is why they don't work with those x1-to-3-x1 PCI-E expansion cards.
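One quick way to see how many cards the system actually enumerated on the bus (for example after adding the 14th card) is to count the GPU entries in `lspci` output. A minimal sketch under that assumption; the sample text below is illustrative, not captured from a real 19-GPU rig:

```python
# Count GPUs the kernel enumerated from lspci-style output.
# P106 mining cards show up as "3D controller" rather than
# "VGA compatible controller" because they have no display outputs.

SAMPLE_LSPCI = """\
01:00.0 VGA compatible controller: NVIDIA Corporation GP102 [GeForce GTX 1080 Ti]
02:00.0 VGA compatible controller: NVIDIA Corporation GP106 [GeForce GTX 1060 6GB]
03:00.0 3D controller: NVIDIA Corporation GP106 [P106-100]
"""

def count_gpus(lspci_output):
    """Count VGA and 3D controller entries in lspci text."""
    return sum(
        1 for line in lspci_output.splitlines()
        if "VGA compatible controller" in line or "3D controller" in line
    )

print(count_gpus(SAMPLE_LSPCI))  # prints 3 for the sample above
```

On a live Linux rig the equivalent one-liner would be `lspci | grep -c -E "VGA|3D controller"`; if the count stops short of the number of installed cards, the missing ones never enumerated on the bus at all.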



 
member
Activity: 101
Merit: 10
September 23, 2017, 08:10:23 PM
#32
Hello,

I just wanted to say that we managed to get 19 GPUs running in one motherboard with no big problems.

MOBO: Asus B250 Mining Expert

GPUs: 13x ASUS STRIX GTX 1060 6 GB + 6x ASUS MINING P106

MAIN PSU: 2400W
Secondary PSU (for powering the PCIe extenders and the mobo): 750W
We also have a third PSU in the video, but in reality you only need the main and secondary PSUs.
16GB RAM

NO HDD, just one fast 32GB USB stick. OS: nvOC easy-to-use Linux Nvidia Mining v0019
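For context, a back-of-the-envelope power budget for this parts list. The per-card wattage is an assumption (roughly the reference board power of a GTX 1060 / P106), not a measurement from this rig:

```python
# Rough power budget for the 19-GPU build; per-card wattages are
# assumed reference figures, not measured values from this rig.

gpu_counts = {"GTX 1060 6GB": 13, "MINING P106": 6}
assumed_watts = {"GTX 1060 6GB": 120, "MINING P106": 120}

gpu_total = sum(n * assumed_watts[name] for name, n in gpu_counts.items())
print(gpu_total)  # 2280 W of GPU load against the 2400 W main PSU
```

Under those assumptions the 2400W main PSU is nearly fully loaded by the cards alone, which is consistent with putting the motherboard and riser power on the separate 750W secondary PSU.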

In Windows we were able to get all the cards to show up in Device Manager, but mining would not run :(

https://drive.google.com/drive/u/1/folders/0B4y5D_zzH1oeLWpIcGtQOXI1MUU

Will post more videos tomorrow, along with a detailed analysis of the mobo and a manual on how to do this.

Best regards

cReepas

(Invictus Mining Team)


It sounds great but impractical, because if any GPU or other component has a problem, you need to restart the whole system:
- GPUs hanging after long uptimes.
- Failed risers.
- An unstable power supply.

It could work for a hobby, but I would avoid any motherboard with more than 7 GPUs. Check any of the large farms: they only use 4 to 7 GPUs per rig.
sr. member
Activity: 468
Merit: 250
J
September 23, 2017, 08:04:35 PM
#31
If I tried to use 4GB RAM and a G3900 Celeron CPU with a 19-GPU rig like this, I would probably have a bad time, right? I've got 9 GPUs on these specs right now, and I'm wondering how long I'll be able to go on like this.

You would have to check the CPU usage; as long as all 19 GPUs are working and CPU usage stays below 80%, it's okay.

This mobo has a useful feature that shows what is wrong with the PCIe! :)

It's far from plug and play, but we actually don't have that many stability issues, and if something goes wrong with the PCIe, we see it.

cReepas

He is talking about hot-plugging GPUs.

Yes, I'm curious as well. What kind of CPU are you using for this build?
hero member
Activity: 1036
Merit: 606
September 23, 2017, 07:54:04 PM
#30
2nd and 3rd server PSUs are powering all GPUs

Does not boot to BIOS. The screen does not register a signal.

It doesn't matter which slots are populated and which are empty; the 14th card just kills it.

Is this starting to look like a bad motherboard, or one that I may have already damaged through an improper initial configuration?

If 13 GPUs are working, I doubt it's a bad motherboard. More likely it's an Nvidia driver limitation.
full member
Activity: 362
Merit: 102
September 23, 2017, 07:37:58 PM
#29
2nd and 3rd server PSUs are powering all GPUs

Does not boot to BIOS. The screen does not register a signal.

It doesn't matter which slots are populated and which are empty; the 14th card just kills it.

Is this starting to look like a bad motherboard, or one that I may have already damaged through an improper initial configuration?
full member
Activity: 140
Merit: 100
September 23, 2017, 07:35:22 PM
#28
As soon as I plug in the 14th GPU, I can't get this thing to boot. Any ideas?

Like no POST? Not even to BIOS? I was going to say, make sure you're powering everything. Depending on how you've got things connected, you might have the 14th GPU on its own on the 3rd PSU's slots. If so, is it getting power from the 3rd PSU?
full member
Activity: 362
Merit: 102
September 23, 2017, 07:08:54 PM
#27
1080 Ti, nvOS, G3930, 4GB RAM

Using an SD card in a card reader, two 2400W server PSUs, and one 1600W ATX as the main PSU.

It seems that no matter what I do, the 14th card results in not booting at all, no BIOS or anything. Right back up when I remove one card. I've got another 4GB of RAM on the way already. The computer is painfully slow when I try to do anything with 13 cards, but it seems to be mining stably so far.
hero member
Activity: 1036
Merit: 606
September 23, 2017, 07:06:02 PM
#26
As soon as I plug in the 14th GPU, I can't get this thing to boot. Any ideas?

What cards and OS?
full member
Activity: 362
Merit: 102
September 23, 2017, 06:59:17 PM
#25
As soon as I plug in the 14th GPU, I can't get this thing to boot. Any ideas?
legendary
Activity: 3416
Merit: 1059
September 12, 2017, 11:11:05 AM
#24
What we need is not a motherboard that has lots and lots of PCIe slots...

What we need is an interface that can manage many cards, where we can plug and unplug cards without turning off the system, and where a card that has a problem hangs or stops on its own without affecting the whole system, so the other cards keep mining.

With that kind of interface we could use almost any board out there. That interface should have management software and hardware switches (e.g. for cutting power) for the plug-and-play feature.

This mobo has a useful feature that shows what is wrong with the PCIe! :)

It's far from plug and play, but we actually don't have that many stability issues, and if something goes wrong with the PCIe, we see it.

cReepas

My old socket 775 motherboard, circa 2006-2007, has LEDs that show the connection state when a card is connected. It doesn't show an error color, but I can identify which 1 of 4 GPUs is not mining simply by touching them: the one that is not hot is not mining.
full member
Activity: 362
Merit: 102
September 12, 2017, 08:09:01 AM
#23
Currently, my 9-GPU rig shows CPU1: 13.7% and CPU2: 90%.

It's also using 2.3 GB of its 3.8 GB of RAM.

So I should probably invest in some upgrades for more GPUs, yeah?
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 12, 2017, 07:53:27 AM
#22
If I tried to use 4GB RAM and a G3900 Celeron CPU with a 19-GPU rig like this, I would probably have a bad time, right? I've got 9 GPUs on these specs right now, and I'm wondering how long I'll be able to go on like this.

You would have to check the CPU usage; as long as all 19 GPUs are working and CPU usage stays below 80%, it's okay.

This mobo has a useful feature that shows what is wrong with the PCIe! :)

It's far from plug and play, but we actually don't have that many stability issues, and if something goes wrong with the PCIe, we see it.

cReepas

He is talking about hot-plugging GPUs.
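The "keep CPU usage below 80%" rule of thumb above can be checked programmatically on a Linux rig. A minimal sketch computing utilization from two snapshots of the aggregate "cpu" line in `/proc/stat`; the snapshot numbers here are made up for illustration, and on a live system you would read `/proc/stat` twice, about a second apart:

```python
# Percent CPU busy between two snapshots of the aggregate "cpu" line
# in /proc/stat (fields: user nice system idle iowait irq softirq ...).

def cpu_usage(prev, curr):
    """Return percent busy; idle time is the idle + iowait fields."""
    prev_idle = prev[3] + prev[4]
    curr_idle = curr[3] + curr[4]
    delta_total = sum(curr) - sum(prev)
    delta_idle = curr_idle - prev_idle
    return 100.0 * (delta_total - delta_idle) / delta_total

# Made-up sample snapshots (in jiffies), not from a real rig.
prev = [100, 0, 50, 800, 50, 0, 0]
curr = [200, 0, 100, 850, 50, 0, 0]
print(f"{cpu_usage(prev, curr):.0f}% busy")  # 75% on these samples
```

If that figure creeps toward 80% with all GPUs mining, that matches the advice above that the CPU is becoming the bottleneck.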
full member
Activity: 362
Merit: 102
September 12, 2017, 07:11:51 AM
#21
If I tried to use 4GB RAM and a G3900 Celeron CPU with a 19-GPU rig like this, I would probably have a bad time, right? I've got 9 GPUs on these specs right now, and I'm wondering how long I'll be able to go on like this.
member
Activity: 67
Merit: 10
September 12, 2017, 02:26:14 AM
#20
What we need is not a motherboard that has lots and lots of PCIe slots...

What we need is an interface that can manage many cards, where we can plug and unplug cards without turning off the system, and where a card that has a problem hangs or stops on its own without affecting the whole system, so the other cards keep mining.

With that kind of interface we could use almost any board out there. That interface should have management software and hardware switches (e.g. for cutting power) for the plug-and-play feature.

This mobo has a useful feature that shows what is wrong with the PCIe! :)

It's far from plug and play, but we actually don't have that many stability issues, and if something goes wrong with the PCIe, we see it.

cReepas
legendary
Activity: 3416
Merit: 1059
September 10, 2017, 07:34:53 AM
#19
What we need is not a motherboard that has lots and lots of PCIe slots...

What we need is an interface that can manage many cards, where we can plug and unplug cards without turning off the system, and where a card that has a problem hangs or stops on its own without affecting the whole system, so the other cards keep mining.

With that kind of interface we could use almost any board out there. That interface should have management software and hardware switches (e.g. for cutting power) for the plug-and-play feature.
sr. member
Activity: 2142
Merit: 353
Xtreme Monster
September 10, 2017, 05:35:07 AM
#18

Good work; there are a lot of things to test out with this motherboard. I hope it goes on sale soon. Asus said AMD will release a driver that supports all 19 cards by the end of this year. Let's hope they deliver.
member
Activity: 67
Merit: 10
September 10, 2017, 04:52:28 AM
#17
Hello,

I'm adding a video: https://www.youtube.com/watch?v=DjFP1BfN5Xs&t=232s

Best regards

cReepas

Invictus Mining Team