So, I am getting ready to strap-mod the BIOS on my 2nd mining rig. However, I realized that I need to know the device number of each of the 5 GPUs before I do this, since the BIOS flash command line specifies which device to flash.
This was not a concern on rig #1 because it has 4 identical MSI 470s in it. However, on rig #2 I have one Sapphire Nitro 470 8GB in the mix, so I need to make sure I am flashing the correct cards. It may turn out that the Sapphire takes the same strap-mod (more on that later), but I really need to know how to identify which device number Windows has assigned to each of my 5 GPUs.
If someone could point me to a resource on this, or suggest a way to easily identify them on Windows 7, I would appreciate it.
For instance, what determines which device # is assigned? Is it the order you install the cards? Or the PCIe slot they are plugged into? Or, is it something else?
The PCIe slots on my ASRock BTC R2 mobo are numbered 1-6 from right to left, with the lone x16 slot (currently unoccupied) being slot #2. However, when I go into Device Manager and check the properties for each GPU, it shows only PCI bus 2-6 for the 5 GPUs; nothing shows as PCI bus 1. I don't know how that correlates anyway, since the device #s I am looking for begin with 0, so I'm assuming I should be looking for device #s 0-5, not 1-6. FYI, my lone Sapphire is plugged into PCIe slot #4 (3rd installed slot from the right), and I am pretty sure (but not 100% confident) that it was also the 3rd GPU I installed. Should this mean that it is device #2 (after 0 and 1)? Or is it not that straightforward? Thanks in advance for any suggestions.
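For what it's worth, my understanding (an assumption, not a guarantee) is that flash tools typically number adapters 0 through n-1 in ascending PCI bus order, which you can sketch like this. The bus numbers below are the example values from Device Manager above; verify against your own tool's adapter listing before flashing anything.

```python
# Sketch: IF the flash tool numbers adapters 0..n-1 in ascending PCI bus
# order (an assumption -- verify with your tool's adapter-listing option),
# the mapping from Device Manager's bus numbers would look like this.

# PCI bus numbers shown in Device Manager for the 5 GPUs (example values)
bus_numbers = [2, 3, 4, 5, 6]

# Assign zero-based device numbers in ascending bus order
device_map = {bus: idx for idx, bus in enumerate(sorted(bus_numbers))}

for bus, dev in sorted(device_map.items()):
    print(f"PCI bus {bus} -> flash device #{dev}")

# Under this assumption, the card on PCI bus 4 (the Sapphire, if slot #4
# maps to bus 4) would be flash device #2
print(device_map[4])  # 2
```

Again, this only holds if slot-to-bus-to-device ordering is monotonic on your board, which is exactly what I'd want someone to confirm.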
Do not be stupid: pull the Sapphire and just flash the other cards (all the same, correct? MSI RX 470s).
Then flash the Sapphire by itself. And I am really pretty sure you need a different BIOS to flash it, or it's brick city.
Well yeah, obviously I could pull that orphan Sapphire as you mention, but I would rather avoid that if possible for a couple of reasons. First, I had a hell of a time getting 5 GPUs working on this rig, so aside from eventually trying to get a 6th working in that x16 slot, I'm not eager to start unplugging and re-plugging cards in case it screws things up again.
The other reason is this: I'm not even sure how pulling that card would affect my device numbers. For instance, will Windows re-number the GPUs after reboot based only on the 4 that are now plugged in, meaning they would then be 0, 1, 2 & 3?
Is Windows really so lame that it doesn't have a utility or easy way to ID which GPU it assigned which device number? It seems more likely that there is a way; I just can't figure it out.
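One thing worth trying: ATIFlash has an adapter-listing mode (`atiflash -i`, if I remember right) that prints each adapter's index along with bus and board info, which you can cross-reference against Device Manager's bus numbers. A small sketch of that cross-referencing, with a made-up listing format (the real columns will differ, so check your tool's actual output):

```python
# Hypothetical parser for an atiflash-style adapter listing. The exact
# column layout below is INVENTED for illustration -- check the real
# output of your flash tool before relying on any of this.
sample_listing = """\
0  Polaris10  0x1002  0x67DF  02  MSI RX 470
1  Polaris10  0x1002  0x67DF  03  MSI RX 470
2  Polaris10  0x1002  0x67DF  04  Sapphire Nitro RX 470 8GB
3  Polaris10  0x1002  0x67DF  05  MSI RX 470
4  Polaris10  0x1002  0x67DF  06  MSI RX 470
"""

def adapters_by_bus(listing):
    """Map PCI bus number -> (flash device index, board name)."""
    result = {}
    for line in listing.strip().splitlines():
        # columns: index, ASIC, vendor ID, device ID, bus, board name
        parts = line.split(None, 5)
        idx, bus, name = int(parts[0]), int(parts[4]), parts[5]
        result[bus] = (idx, name)
    return result

mapping = adapters_by_bus(sample_listing)
print(mapping[4])  # the card Device Manager shows on PCI bus 4
```

The point is just that the tool's own listing, not Device Manager alone, is what ties a flash device number to a physical card.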
Partly Windows IS that lame, partly it's a limitation of the device drivers, and partly it's that very few folks run more than ONE card per machine, so it's a very low priority for M$, AMD, or NVidia to bother fixing.
The best way to flash the cards is to do them ONE AT A TIME, with only one card in the rig you are using to flash them. That way you only risk bricking one card at a time, and you can verify that the flash for each card worked.
Thanks to all for your advice on my issues. I had somewhat of a breakthrough last night in regard to getting a 6th GPU working on my ASRock BTC Pro R2 Mobo.
I realized that I had not tried plugging a card into that pesky x16 slot since I ran the 6xGPU registry mod, so I figured I would give it a try. I shut down rig #1, stole an MSI 470 from it, plugged it into the x16 slot on rig #2, and powered up. Unfortunately, I got the same problem of the machine not booting. Then, out of the blue, I had the idea to plug the monitor into the card in that x16 slot and see if that did anything. Amazingly, after making this switch, I was able to boot with 6 GPUs for the first time.

Unfortunately, 2 problems showed up (maybe the same problem with 2 symptoms). #1: when it made it to the Windows login screen, it switched to that weird magnified view... you know, where the screen is zoomed in 3-4x so you can't even see full windows that you open? In my experience this usually means a video driver issue. So I checked Device Manager, and of course the newest GPU was showing that error 43. Slightly optimistic because running the 6xGPU registry mod had fixed this problem with GPU #5 on this machine, I ran it again, as Admin. Instead of fixing it, it momentarily showed all 6 GPUs with error 43, and then after a reboot GPUs 1-5 were OK again but #6 still showed error 43. I tried it several times with the same result.
After about an hour of downtime on both rigs, I decided to return the GPU to its home on rig #1. But, encouraged that I could at least get it to boot with 6 GPUs, I went ahead and ordered another 470 from Amazon to arrive Saturday. This way I can experiment on rig #2 without taking rig #1 down.
I think the first thing I need to try is the GPU driver cleaner that Citronick recommended; then hopefully running the 6xGPU mod again will get that 6th GPU working. I also had the thought that maybe putting a dummy HDMI plug on that x16 GPU could help: keep my monitor plugged into the card in PCIe slot #1 (x1) and use the dummy plug on the card in slot #2 (x16) just to get it to boot.
Anyway, it seems like a little progress, and I have some things in mind to try, but other suggestions would be welcome.