
Topic: Can you have 2 PSU's connected to 1 Rig? (Read 5897 times)

sr. member
Activity: 840
Merit: 255
SportsIcon - Connect With Your Sports Heroes
January 04, 2014, 08:44:50 PM
#26
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.
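For reference, the paper-clip trick described here works because the green wire carries the PS_ON# signal, and pulling it to any black ground wire tells an ATX PSU to switch on. A minimal sketch using standard ATX 2.x 24-pin numbering (verify against your own PSU's wire colors before shorting anything):

```python
# Relevant pins of a standard 24-pin ATX main connector (ATX 2.x numbering).
# PS_ON# (green) pulled to any ground (black) tells the PSU to switch on.
ATX_24PIN = {
    16: ("PS_ON#", "green"),  # active-low power-on signal
    15: ("GND", "black"),     # ground directly beside PS_ON#
    17: ("GND", "black"),     # another adjacent ground
}

def jump_start_pair():
    """Return a (PS_ON#, GND) pin pair to bridge with the paper clip."""
    ps_on = next(p for p, (name, _) in ATX_24PIN.items() if name == "PS_ON#")
    gnd = next(p for p, (name, _) in ATX_24PIN.items() if name == "GND")
    return ps_on, gnd

print(jump_start_pair())  # (16, 15)
```

Pins 15 and 17 both work as the ground side; the clip just has to bridge PS_ON# to any adjacent black wire.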

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
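The "whatever fits on a 15A@120V circuit" rule of thumb is easy to make concrete. A quick sketch; the 80% continuous-load derating, the overhead figure, and the per-card wall draw are my assumptions, not numbers from this thread:

```python
# How many GPUs fit on one 15 A @ 120 V branch circuit?
circuit_watts = 15 * 120                 # 1800 W nameplate
continuous_watts = circuit_watts * 0.8   # 1440 W with the 80% continuous-load rule
overhead_watts = 150                     # assumed motherboard/CPU/fans overhead
watts_per_card = 250                     # assumed wall draw per undervolted 280X-class card

cards = int((continuous_watts - overhead_watts) // watts_per_card)
print(cards)  # 5
```

On these assumptions one circuit carries about 5 cards, which lines up with the 5-6 cards per system target above.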
Understand that shorting pins on power supplies is yet another point of failure and is asking for trouble for a newbie/intermediate miner. I don't disagree with your post and perspective, though.

Shorting pins is a point of failure? I highly doubt that, not if you do it right. It's exceptionally easy with a single strand of solid Cat6. In most cases it is not even necessary, and the riser cables I use have a 1x presence-pin jumper built in anyway. Besides, I think the jumper is just for BIOS initialization; after that point you could probably remove it until you restart.
It is up to him to decide.

Quote
Running 3 in one mobo directly is a great way to melt the cards down and obliterate the fans in well under 6 months. I know from experience: no amount of external fan velocity or volume will keep three cards stacked that close at any decent running temps.
Did you undervolt the cards? That said, running cards directly on the motherboard isn't even my main point; rather, it is to not connect more cards than there are PCI-e x16 slots available.

It doesn't matter whether the cards are undervolted or not; they will overheat directly on the motherboard. The reference cooler designs are better, but there are no blower-style 280/270 cards that I know of, and the 290s run so hot you'd be nuts to try. It would likely limit your hashrate, and the cost of risers might be made up by the extra hashpower over the life of the machine. All that heat will play hell with the motherboard too, especially if it's not supported properly. That's a lot of weight, and a lot of heat...

Why exactly is it that you recommend staying with x16 slots only? Is it just some wild hunch? It doesn't really make sense, since you apparently advocate the use of riser cables now. The x16 risers are horribly designed and more prone to failure, with 15 lanes of unnecessary connections on a crummy ribbon cable, and they are a serious airflow restriction for absolutely no reason! If you use x1 risers, what difference does the slot make? You'd probably need to jump the x1 presence pin at that point, too.

I have several rigs with multiple power supplies running 6 cards per board, and they do just fine. I would have spent double on supporting hardware if I had stayed at 3 cards/rig, for no reason.
Do you realize that what we are discussing here is initial rigs? Do you read the intent, or do you want to go on cherry-picking and derailing? Will you pay, or help him, if he makes initial missteps (e.g. gets faulty risers, fails to short the pins, etc.)?

@OP: Improve your design and efficiency over time. For now, start SIMPLE and SAFE.
sr. member
Activity: 364
Merit: 250
January 04, 2014, 08:07:05 PM
#25
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
Understand that shorting pins on power supplies is yet another point of failure and is asking for trouble for a newbie/intermediate miner. I don't disagree with your post and perspective, though.

Shorting pins is a point of failure? I highly doubt that, not if you do it right. It's exceptionally easy with a single strand of solid Cat6. In most cases it is not even necessary, and the riser cables I use have a 1x presence-pin jumper built in anyway. Besides, I think the jumper is just for BIOS initialization; after that point you could probably remove it until you restart.
It is up to him to decide.

Quote
Running 3 in one mobo directly is a great way to melt the cards down and obliterate the fans in well under 6 months. I know from experience: no amount of external fan velocity or volume will keep three cards stacked that close at any decent running temps.
Did you undervolt the cards? That said, running cards directly on the motherboard isn't even my main point; rather, it is to not connect more cards than there are PCI-e x16 slots available.

It doesn't matter whether the cards are undervolted or not; they will overheat directly on the motherboard. The reference cooler designs are better, but there are no blower-style 280/270 cards that I know of, and the 290s run so hot you'd be nuts to try. It would likely limit your hashrate, and the cost of risers might be made up by the extra hashpower over the life of the machine. All that heat will play hell with the motherboard too, especially if it's not supported properly. That's a lot of weight, and a lot of heat...

Why exactly is it that you recommend staying with x16 slots only? Is it just some wild hunch? It doesn't really make sense, since you apparently advocate the use of riser cables now. The x16 risers are horribly designed and more prone to failure, with 15 lanes of unnecessary connections on a crummy ribbon cable, and they are a serious airflow restriction for absolutely no reason! If you use x1 risers, what difference does the slot make? You'd probably need to jump the x1 presence pin at that point, too.

I have several rigs with multiple power supplies running 6 cards per board, and they do just fine. I would have spent double on supporting hardware if I had stayed at 3 cards/rig, for no reason.
sr. member
Activity: 840
Merit: 255
SportsIcon - Connect With Your Sports Heroes
January 04, 2014, 07:52:20 PM
#24
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
Understand that shorting pins on power supplies is yet another point of failure and is asking for trouble for a newbie/intermediate miner. I don't disagree with your post and perspective, though.

Shorting pins is a point of failure? I highly doubt that, not if you do it right. It's exceptionally easy with a single strand of solid Cat6. In most cases it is not even necessary, and the riser cables I use have a 1x presence-pin jumper built in anyway. Besides, I think the jumper is just for BIOS initialization; after that point you could probably remove it until you restart.
It is up to him to decide.

Quote
Running 3 in one mobo directly is a great way to melt the cards down and obliterate the fans in well under 6 months. I know from experience: no amount of external fan velocity or volume will keep three cards stacked that close at any decent running temps.
Did you undervolt the cards? That said, running cards directly on the motherboard isn't even my main point; rather, it is to not connect more cards than there are PCI-e x16 slots available.

sr. member
Activity: 364
Merit: 250
January 04, 2014, 06:06:30 PM
#23
1000W+ PSUs are pretty expensive here, $400 USD for the cheapest one, so I am kinda forced to use 2x 750W ($135 USD each)... :(

Is it really that high a risk of losing all the hardware when using dual PSUs? Is it way more risky than using 1x 1500W?

No more of a risk than using a 1500W PSU.

Besides, the 1500W PSU is probably just six 250W power supplies tied together internally anyway. Chances are you can work out better load balancing with two smaller power supplies than with one large one, too.
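The load-balancing point can be made concrete by sanity-checking a split before wiring it up. All the wattages below are my assumed, ballpark figures (not measurements from this thread):

```python
# Sanity-check a two-PSU split: motherboard + 2 cards on PSU A, 4 cards on
# PSU B. Card, board, and PSU wattages are assumed illustrative figures.
CARD_W, BOARD_W, PSU_W = 160, 100, 750

psu_a = BOARD_W + 2 * CARD_W   # 420 W
psu_b = 4 * CARD_W             # 640 W

for name, load in [("A", psu_a), ("B", psu_b)]:
    ok = load <= 0.8 * PSU_W   # keep each unit under ~80% of its rating
    print(name, load, "OK" if ok else "overloaded - rebalance the cards")
```

Here the 2/4 split overloads PSU B, while a 3/3 split (100 + 3x160 = 580 W vs 3x160 = 480 W) brings both units under the 600 W line. That per-unit tuning is exactly what a single big supply can't give you.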

member
Activity: 84
Merit: 10
January 04, 2014, 06:03:30 PM
#22
1000W+ PSUs are pretty expensive here, $400 USD for the cheapest one, so I am kinda forced to use 2x 750W ($135 USD each)... :(

Is it really that high a risk of losing all the hardware when using dual PSUs? Is it way more risky than using 1x 1500W?
sr. member
Activity: 364
Merit: 250
January 04, 2014, 05:56:36 PM
#21
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
Understand that shorting pins on power supplies is yet another point of failure and is asking for trouble for a newbie/intermediate miner. I don't disagree with your post and perspective, though.

Shorting pins is a point of failure? I highly doubt that, not if you do it right. It's exceptionally easy with a single strand of solid Cat6. In most cases it is not even necessary, and the riser cables I use have a 1x presence-pin jumper built in anyway. Besides, I think the jumper is just for BIOS initialization; after that point you could probably remove it until you restart.

Running 3 in one mobo directly is a great way to melt the cards down and obliterate the fans in well under 6 months. I know from experience: no amount of external fan velocity or volume will keep three cards stacked that close at any decent running temps.
sr. member
Activity: 840
Merit: 255
SportsIcon - Connect With Your Sports Heroes
January 04, 2014, 05:51:33 PM
#20
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
Understand that shorting pins on power supplies is yet another point of failure and is asking for trouble for a newbie/intermediate miner. I don't disagree with your post and perspective, though.
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 05:42:59 PM
#19
if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining

You don't need an Add2PSU. Just use a paper clip to short the green and black wires in the 24-pin ATX connector. Plug both PSUs into the same power bar/surge protector and flip the switch to turn both on at the same time. That adapter is useless.

You are right that 3 cards per motherboard simplifies things. No support racks to hold the GPUs. One PSU per rig.
But if you are building 10+ rigs, space is a factor, so you want to run more than 3 cards per motherboard. Ideally, 5-6 cards per system.
Whatever you can squeeze onto the same 15A@120V circuit.
full member
Activity: 223
Merit: 100
January 04, 2014, 05:11:50 PM
#18
My advice, which you and others may or may not agree with: forget risers, dual-PSU setups, and shorting pins. The motherboard supports 3 cards, so put in 3 cards.

Let's say you want a setup with 5-6 280X cards.
Another base motherboard + CPU (e.g. a Sempron 190) + RAM + boot pendrive costs ~$200. You'd obviously not need a 2nd base system, and would save those $200, if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining, shorting pins (you may fry a motherboard this way) or paying more for better risers, running the motherboard out of spec, and concentrating all your hash power on a single point of failure.

Make sure you undervolt the cards and use a large fan nearby.

Thanks for the reply, it was helpful.

Just wondering, how do you undervolt the cards?

sr. member
Activity: 840
Merit: 255
SportsIcon - Connect With Your Sports Heroes
January 04, 2014, 03:49:19 PM
#17
My advice, which you and others may or may not agree with: forget risers, dual-PSU setups, and shorting pins. The motherboard supports 3 cards, so put in 3 cards.

Let's say you want a setup with 5-6 280X cards.
Another base motherboard + CPU (e.g. a Sempron 190) + RAM + boot pendrive costs ~$200. You'd obviously not need a 2nd base system, and would save those $200, if you use risers and the dual-PSU setup. However, you're paying for risers and an Add2PSU adapter that will be worthless when you stop mining, shorting pins (you may fry a motherboard this way) or paying more for better risers, running the motherboard out of spec, and concentrating all your hash power on a single point of failure.

Make sure you undervolt the cards and use a large fan nearby.
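The trade-off weighed in this post reduces to simple arithmetic. Only the ~$200 base-system figure comes from the post; the riser and adapter prices below are assumed placeholders:

```python
# Rough up-front cost for a 6-card setup: one rig with risers + Add2PSU
# versus two 3-card rigs on separate boards.
BASE_SYSTEM = 200   # motherboard + Sempron + RAM + boot pendrive (from the post)
RISER = 8           # assumed price per x1 riser
ADD2PSU = 15        # assumed adapter price

one_rig_extra = 6 * RISER + ADD2PSU   # riser/adapter gear for one 6-card rig
two_rig_extra = BASE_SYSTEM           # a second base system instead

print(one_rig_extra, two_rig_extra)   # 63 vs 200
```

On these placeholder numbers the riser route saves roughly $137 up front, which is exactly what the post weighs against out-of-spec operation and a single point of failure.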
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 01:05:50 PM
#16
250MB/s PER LANE

Yes, I know, but is it sufficient? :)

Also, let's say the riser/slot is PCI-e v1.0 x1 and the GPU is PCI-e v3.0 x16. Is PCI-e backwards compatible? Will it work, or do I need a v3.0 motherboard too?

The miner loads KBs of data to the GPU kernel to execute, and the responses are very small. It's passing bytes back and forth.

1MB/s would be more than enough.

You can put a 1.0 device in a 3.0 slot or a 3.0 device in a 1.0 slot. The result is the same: 1.0 speed.
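The claim that 1 MB/s would be more than enough is easy to sanity-check against the slowest link discussed here, PCIe 1.0 x1. The per-work-unit size and launch rate below are my assumed figures for a kilobyte-scale workload like the one described:

```python
# Compare miner traffic to the slowest link in question: PCIe 1.0 x1.
LANE_1_0_MBPS = 250          # PCIe 1.0, ~250 MB/s per lane per direction

work_unit_kb = 4             # assumed data pushed per kernel launch
launches_per_sec = 100       # assumed kernel launch rate

traffic_mbps = work_unit_kb * launches_per_sec / 1024   # MB/s actually used
headroom = LANE_1_0_MBPS / traffic_mbps

print(round(traffic_mbps, 2), round(headroom))  # 0.39 640
```

Even with these generous assumptions the lane has hundreds of times more bandwidth than the miner needs, which is why the slot width doesn't affect hashrate.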
member
Activity: 84
Merit: 10
January 04, 2014, 12:58:21 PM
#15
250MB/s PER LANE

Yes, I know, but is it sufficient? :)

Also, let's say the riser/slot is PCI-e v1.0 x1 and the GPU is PCI-e v3.0 x16. Is PCI-e backwards compatible? Will it work, or do I need a v3.0 motherboard too?
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 12:53:11 PM
#14
250MB/s PER LANE
member
Activity: 84
Merit: 10
January 04, 2014, 12:51:53 PM
#13

Yes, you can use 1x-1x in any of the PCI-e slots.

You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?

1x-1x in any slot. Some motherboards might require you to short two pins.

Will using 1x-1x instead of 16x-1x affect the hashrate in any way, and why?
And the same question for a 16x slot: will using 1x-1x instead of 16x-16x affect the hashrate too?
No. The PCIe 3.0 transfer rate is 985 MB/s PER LANE in each direction. That is more than enough for mining.
So 1x is all you need.

What about PCIe 1.0? It's about 250 MB/s.
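Both per-lane figures quoted in this thread fall straight out of the line rate and the encoding overhead. A quick check:

```python
# PCIe per-lane throughput = line rate x encoding efficiency / 8 bits per byte.
# PCIe 1.0: 2.5 GT/s with 8b/10b encoding (8 payload bits per 10 on the wire).
pcie1 = 2.5e9 * (8 / 10) / 8 / 1e6    # MB/s per lane, per direction
# PCIe 3.0: 8 GT/s with the much leaner 128b/130b encoding.
pcie3 = 8e9 * (128 / 130) / 8 / 1e6

print(pcie1, round(pcie3, 1))  # 250.0 984.6 -- the "250 MB/s" and "985 MB/s" above
```

Either figure dwarfs the byte-scale traffic a miner generates, so even a 1.0 x1 link is not the bottleneck.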
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 12:47:26 PM
#12

Yes, you can use 1x-1x in any of the PCI-e slots.

You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?

1x-1x in any slot. Some motherboards might require you to short two pins.

Will using 1x-1x instead of 16x-1x affect the hashrate in any way, and why?
And the same question for a 16x slot: will using 1x-1x instead of 16x-16x affect the hashrate too?
No. The PCIe 3.0 transfer rate is 985 MB/s PER LANE in each direction. That is more than enough for mining.
So 1x is all you need.
member
Activity: 84
Merit: 10
January 04, 2014, 12:39:40 PM
#11

Yes, you can use 1x-1x in any of the PCI-e slots.

You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?

1x-1x in any slot. Some motherboards might require you to short two pins.

Will using 1x-1x instead of 16x-1x affect the hashrate in any way, and why?
And the same question for a 16x slot: will using 1x-1x instead of 16x-16x affect the hashrate too?
full member
Activity: 223
Merit: 100
January 04, 2014, 12:16:15 PM
#10
You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?

1x-1x in any slot. Some motherboards might require you to short two pins.

That's great, thank you!
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 10:35:12 AM
#9

Yes, you can use 1x-1x in any of the PCI-e slots.

You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?

1x-1x in any slot. Some motherboards might require you to short two pins.

member
Activity: 84
Merit: 10
January 04, 2014, 10:27:49 AM
#8


Yes, you can use 1x-1x in any of the PCI-e slots.

You mean you can use 1x-1x with a 16x GPU card when the slot is 1x?
legendary
Activity: 2702
Merit: 1468
January 04, 2014, 10:18:34 AM
#7
You mean PCI-e 2.0 x1 slots? If they are PCI-e too, then yes, but you would need riser cables to connect a PCI-e x16 card to a PCI-e x1 slot (a riser with an x16 female end and an x1 male end).

I would suggest using x16-to-x16 risers on those 3 x16 slots as well.

Hi,

Yeah, I meant PCIe 2.0 x1 slots. I'm going to buy some risers this weekend. Do the risers have to be powered risers, for either the PCI-e x16 or the PCI-e x1 slots? I plan on having 3 GPUs connected via PCI-e x16 and 1 connected via PCI-e x1.

I saw people recommending powered ones. Why? I don't know; if someone could explain it I would be grateful :)

If you use more than 3 cards, you should use powered risers. The reason is that the 24-pin connector might not provide enough +12V current to power the PCI-e bus. Some motherboards have a dedicated PCI-e 6- or 8-pin connector right on the board for this. If you don't provide enough power to the PCI-e bus, the yellow wire going to the 24-pin connector will melt, taking the connector with it.
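The melting-wire warning is at heart a current budget. The per-pin rating and typical per-card slot draw below are my assumed, ballpark figures; the ~75 W cap per x16 graphics slot comes from the PCIe electromechanical spec:

```python
# +12 V power that unpowered risers pull through the 24-pin ATX connector.
SLOT_LIMIT_W = 75      # PCIe spec cap per x16 slot (mostly on +12 V)
CARD_SLOT_W = 45       # assumed typical +12 V slot draw per mining card
PIN_AMPS = 6           # assumed safe current per 24-pin connector pin
N_12V_PINS = 2         # the 24-pin ATX connector has two +12 V pins

budget_w = N_12V_PINS * PIN_AMPS * 12   # 144 W deliverable before wires cook
for cards in range(1, 7):
    demand = cards * CARD_SLOT_W
    status = "OK" if demand <= budget_w else "use powered risers"
    print(cards, demand, status)
```

On these assumptions the budget runs out past 3 cards, matching the "more than 3 cards, use powered risers" rule of thumb; worst case (every card pulling the full 75 W from the slot), even 2 cards would already be marginal.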

Yes, you can use 1x-1x in any of the PCI-e slots.