
Topic: 7xGPU + Windows // modding AMD video driver is the answer - page 2. (Read 71737 times)

lbr
sr. member
Activity: 423
Merit: 254
Since a riser connected to the motherboard isn't electrically isolated

Most USB risers are.

Anyway, let's stop discussing PSUs here, it's a bit off-topic ; )
hero member
Activity: 1036
Merit: 606
Quote
I'm pretty sure that's a myth that keeps getting re-hashed (for some reason) from the ribbon-powered riser days..
(this coming from a guy that just murdered 2 cards tho.)
USB-powered risers seem to be totally independent. I've been running dual PSUs for months in pretty much any fashion without issue (until now), even non-grounded together.

Case in point: when said idiot forgot to turn off the second rig/PSU before unplugging the PCIe and molex from his poor 480.. that PSU was powering both the riser and the GPU. It was also powering one 570 the same way.. all six 570's survived. The massive back surge when unplugging the PCIe connector is (most likely) what took out the 470 that was plugged into the motherboard slot on the second rig.. yet the 480 + 470 on risers survived. (slot still works fine)

I dunno, just ordered a server PSU to play with.. maybe I'll end up frying all my cards/burning the house down :/

A PCI-E x1 slot on a motherboard has three +12V pins:

http://pinouts.ru/Slots/pci_express_pinout.shtml

Since a riser connected to the motherboard isn't electrically isolated, it makes sense to have the same PSU that powers the motherboard also power the risers connected to it. The way he explained it was that otherwise it creates a situation where the two PSUs fight to regulate the 12V line.

https://bitcointalksearch.org/topic/m.18399752
newbie
Activity: 8
Merit: 0
In researching the dual-PSU power issue I've heard some advocate powering the riser and the PCI-E power connector on the card from the same PSU. Others say the PSU that powers the motherboard must also power all the risers, and the slave PSU should only power the PCI-E power connector on the card. I haven't found a definitive answer, but it seems the people whose cards died were using the slave PSU to power both the risers and the PCI-E power connector on the card.

https://bitcointalksearch.org/topic/m.17827649
I'm pretty sure that's a myth that keeps getting re-hashed (for some reason) from the ribbon-powered riser days..
(this coming from a guy that just murdered 2 cards tho.)
USB-powered risers seem to be totally independent. I've been running dual PSUs for months in pretty much any fashion without issue (until now), even non-grounded together.

Case in point: when said idiot forgot to turn off the second rig/PSU before unplugging the PCIe and molex from his poor 480.. that PSU was powering both the riser and the GPU. It was also powering one 570 the same way.. all six 570's survived. The massive back surge when unplugging the PCIe connector is (most likely) what took out the 470 that was plugged into the motherboard slot on the second rig.. yet the 480 + 470 on risers survived. (slot still works fine)

I dunno, just ordered a server PSU to play with.. maybe I'll end up frying all my cards/burning the house down :/
lbr
sr. member
Activity: 423
Merit: 254
Also, a few days ago I replaced the flat-ribbon-powered risers on my oldest rig alive. It's been running with two PSUs non-stop for about 4 years with no issues ; ) Except some risers burned because of bad solder.

One PSU (a small one) is powering the motherboard and stuff, plus 1 riser and the GPU which is using that riser. The other is powering the 5 other risers and GPUs.
The rig used to run 6950/7950/290 cards.

The PSUs are different. Also one of them (the small one) had capacitors go bad and the rig started to freeze. After the capacitor replacement all is good till now ; )
lbr
sr. member
Activity: 423
Merit: 254
In researching the dual-PSU power issue I've heard some advocate powering the riser and the PCI-E power connector on the card from the same PSU. Others say the PSU that powers the motherboard must also power all the risers, and the slave PSU should only power the PCI-E power connector on the card. I haven't found a definitive answer, but it seems the people whose cards died were using the slave PSU to power both the risers and the PCI-E power connector on the card.

https://bitcointalksearch.org/topic/m.17827649

In this case - one PSU powering all the risers and the other PSU powering the PCIe power connectors - there may be current flow through the GPU from one PSU to the other. In the case where one PSU is on and the other is off, the one which is off will/may blow up because it will receive voltage on its output. And the PSU which is on may/will blow up too, because who knows how many watts the off PSU will dissipate with current forced backwards through its output ; )
And the GPU won't be happy at all. Probably it will burn as well.
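The danger described above is just Ohm's law: two supplies never hold exactly the same 12V setpoint, and once their rails are tied together through the card's power planes, the small voltage difference drives a circulating current limited only by the tiny wiring and plane resistance. A rough sketch (the voltages and resistance below are illustrative assumptions, not measurements):

```python
# Rough illustration of the circulating current between two PSUs whose
# 12V rails end up tied together through a GPU's power planes.
# All numbers are illustrative assumptions.

def circulating_current(v_psu_a, v_psu_b, loop_resistance_ohms):
    """Current driven around the loop by the setpoint mismatch (Ohm's law)."""
    return abs(v_psu_a - v_psu_b) / loop_resistance_ohms

# Two supplies only 0.2 V apart, joined through ~20 milliohms of cable,
# connector, and PCB plane resistance:
i = circulating_current(12.1, 11.9, 0.020)
print(f"circulating current = {i:.1f} A")  # roughly 10 A through parts never sized for it
```

Even a fraction of a volt of mismatch can push many amps through connectors and traces that were never meant to carry it, which is why the symptom is burned risers and dead cards rather than a clean shutdown.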

However, people used to do it because of unpowered/un-cut risers - for example a flat ribbon without additional +12V power. In that case the riser -
Quote
because the risers voltage shares the line with the PCI-Express bus and if there is a tiny litte difference in voltage all hell break loose.
Nowadays people mostly use USB risers, which provide all the voltages to the GPU from their external power connector. So they don't use the PCIe bus voltages.
hero member
Activity: 1036
Merit: 606
In researching the dual-PSU power issue I've heard some advocate powering the riser and the PCI-E power connector on the card from the same PSU. Others say the PSU that powers the motherboard must also power all the risers, and the slave PSU should only power the PCI-E power connector on the card. I haven't found a definitive answer, but it seems the people whose cards died were using the slave PSU to power both the risers and the PCI-E power connector on the card.

https://bitcointalksearch.org/topic/m.17827649
newbie
Activity: 8
Merit: 0
I think this was the issue.
How exactly did you hook up the second PSU?
I had 1 HX850 powering 5 GPUs + 5 risers.. the second powering 2 GPUs + 2 risers + 3 cards idling @ the BIOS screen in the second rig..

I'm leaning towards the true cause of death: it may or may not have been related to some *idiot* forgetting to turn off the second PSU/rig before unplugging the PCIe and molex connectors.... Sad
(dang PSUs that run fan-less until super hot anyway.)
lbr
sr. member
Activity: 423
Merit: 254
(powering 2 risers/cards with second psu..)

I think this was the issue.
How exactly did you hook up the second PSU?

When using dual PSUs, you must be sure that no current flows between the two of them through the GPU.
For example: powering the riser with one PSU and the GPU with the other through the PCIe power connector.
Also, the PSUs' grounds must be tied together.
And both PSUs must be on (or off) at the same time, always. If not, one PSU will feed some power into the output of the other PSU.. and it will/may blow up.

Also, maybe you just overloaded the PSU and it dropped +12V to.. say 10V, or the ripple rose and killed the cards. Or both.
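The rules above boil down to one invariant per card: its riser (slot power) and its PCIe auxiliary connector must hang off the same supply, so no path exists from one PSU's 12V rail to the other through the card. A minimal sketch of checking a wiring plan for that invariant (the rig layout and names below are made up for illustration):

```python
# Sanity-check a dual-PSU wiring plan: every GPU must take its riser
# (slot) power and its PCIe aux power from the same PSU, otherwise a
# current path exists between the two supplies through the card.
# The wiring plan below is a made-up example.

def unsafe_cards(plan):
    """Return the cards whose riser and PCIe-aux power come from different PSUs."""
    return [card for card, (riser_psu, aux_psu) in plan.items()
            if riser_psu != aux_psu]

plan = {
    # card: (PSU feeding the riser, PSU feeding the PCIe power connector)
    "gpu0": ("psu1", "psu1"),
    "gpu1": ("psu1", "psu1"),
    "gpu2": ("psu1", "psu2"),  # mixed -- current can flow through the card
}
print(unsafe_cards(plan))  # ['gpu2']
```

It is just a bookkeeping check, but writing the plan down per card before plugging cables in catches exactly the mixed-supply case that keeps killing cards in this thread.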
newbie
Activity: 8
Merit: 0
Well, the good news is I got 7 570's working in Windows 7 with the mod! It was a simple power problem..

The bad news is, I thought it would be a 'great' idea to test the power theory by using 1 PCIe chain + 1 molex chain from my 4-card rig that was using an ancient DFI LANParty mobo + HX850i, with that PC idling in "BIOS screen mode"..
First I tried 1 480 + 6 570's, which worked fine.. then tried all 7 570's (powering 2 risers/cards with the second PSU..), which also worked fine.. both with multiple reboots to make sure it was working for real.

All was good until trying to hook the 4-card rig back up.. BIOS beeps.. thought it was BS'ing me at first but, guess not Sad.. a PowerColor Red Dragon 480 + a 470 that was minding its own damn business in the motherboard slot -- both apparently dead. (tried in 3 PCs; fans don't even spin.. nothing in Device Manager/atiflash..)

FML, I guess. 1 480 & 470 survived at least.. grr.
Glad it says newbie under my nick sometimes..
lbr
sr. member
Activity: 423
Merit: 254
guess I'm doomed to linux?.. (because, #$&% win10-anything.)
Maybe Windows Server 2016? It looks like Win10 tho, but has all the crap disabled by default.
Or Server Core 2016 ; ) It doesn't have any UI at all. I used to run mining rigs on Server Core 2008.. painful at first ; )
newbie
Activity: 8
Merit: 0
dang, 7 570's show up in Win 7 with 0 error codes on an MSI Gaming 5 mobo, but only 6 work :/
with 7 plugged in, when starting the mining software the LAN cuts out, stuff starts crashing/freezing, time to reboot.
 

guess I'm doomed to linux?.. (because, #$&% win10-anything.)
funny, I thought I'd be running 8 GPUs on this mobo.. (lawl)
sigh.
newbie
Activity: 25
Merit: 0

Still error 43


Installed Win10 and all cards worked straight away, no mod needed...
newbie
Activity: 25
Merit: 0
Try using the registry hacks for "DisplayLessPolicy" and "LimitVideoPresentSources" for all the GPUs (google them). Once you reboot with the hack, disable the card showing code 43, wait 30 seconds, then re-enable it.


Still error 43
lbr
sr. member
Activity: 423
Merit: 254
Try using the registry hacks for "DisplayLessPolicy" and "LimitVideoPresentSources" for all the GPUs (google them). Once you reboot with the hack, disable the card showing code 43, wait 30 seconds, then re-enable it.

Do they actually help AMD cards?
I thought those were for Nvidia..
full member
Activity: 219
Merit: 100
Try using the registry hacks for "DisplayLessPolicy" and "LimitVideoPresentSources" for all the GPUs (google them). Once you reboot with the hack, disable the card showing code 43, wait 30 seconds, then re-enable it.
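For reference, hacks like this are applied as DWORD values under the display-adapter class key. The fragment below is a hypothetical sketch only: the class GUID is the standard Windows display-adapter class, but the device subkey (0000, 0001, ...) differs per machine and per card, and these particular values are most often documented for Nvidia cards - verify they apply to your driver before importing anything.

```reg
Windows Registry Editor Version 5.00

; Hypothetical example of the hack described above. The subkey "0000"
; is a placeholder -- find the subkey matching each GPU under this class
; key and repeat the values there. Back up the registry first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"DisplayLessPolicy"=dword:00000001
"LimitVideoPresentSources"=dword:00000001
```

After importing (and repeating for each GPU's subkey), reboot, then disable and re-enable the code-43 card in Device Manager as described above.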
newbie
Activity: 25
Merit: 0
Can somebody help?
I got 6 GPUs working with this mod, but added a 7th GPU and it shows error 43. Tried to force the mod again but it didn't help. PCIe is set to Gen1, latency 32/64.
Also removed the drivers with DDU and reinstalled them, no help..
GPU driver: [Guru3D]-radeon-crimson-15.12-with-dotnet45-win7-64bit


My rig setup

Windows 7
7x280x
8gb ram
MSI Z97 Gaming 5
2x psu
1000w
860w
normal pcie risers

Anything worth a try?
Or do I have to update to Win10 now?
lbr
sr. member
Activity: 423
Merit: 254
Hello

Which of you has successfully installed 7 GPUs with the ASRock BTC Pro?

I cannot boot with 7 GPUs, and when I do get it to boot, I lose the driver or get a BSOD, yet Windows 10 recognizes all 7 GPUs.
https://bitcointalksearch.org/topic/success-7-gpu-mining-with-asrock-h81-pro-btc-pcie-switchhub-1851487
full member
Activity: 164
Merit: 100
Is there anything special you have to do to that i left out?
Win10/2016

Took your advice and did the free upgrade to Win10. Worked fine with 16.9.1. But besides that, damn, Win10 sucks. Thanks,
newbie
Activity: 66
Merit: 0
I am running Windows 7 with the AMD 16.9.1 driver. I have 6 RX 480s installed; 4 work and 2 show driver issues in Device Manager. Will this mod work for me?

If that is the #43 error, it should work.

So I ran it and now I have 5 working cards instead of 4. It seemed to fix one of them but not both. Is there anything special you have to do that I left out?

Thanks,

Try running it as administrator.
lbr
sr. member
Activity: 423
Merit: 254
Is there anything special you have to do that I left out?
Win10/2016