Most PCIe cables are AWG16 (better ones are AWG14) and they're rated at 13 amps. 13A * 12V = 156W. That's the maximum power that's safe to draw through one of those cables. If you use 1 cable for 2 cards you're asking for trouble; those cables might catch fire.
Not like this.
PCIe cables with an 8-pin connector have 3x 12V leads, so 13*12*3 = 468W.
Are you sure AWG16 can carry 13A?
I checked some data; AWG16 max is about 6A.
So one 8-pin PCIe cable can only deliver 6*12*3 = 216W.
That's why a 1080 Ti needs 8+6-pin or 8+8-pin.
You're right and wrong, and I was wrong.
AWG16 max current is 3.7A as per https://www.powerstream.com/Wire_Size.htm
My first post was from memory, and you're right, I didn't take the 3 power leads into consideration. So the correct math is 3.7*3*12 = 133W per cable.
I don't know where they got their numbers from (powerstream link above), but they're way off and much lower than expected. I wouldn't trust that chart because it doesn't specify temperature, insulation type, or core count.
Here are some better numbers: https://www.engineeringtoolbox.com/wire-gauges-d_419.html
Note that your typical (high-quality) AWG16 wire has about 10-15 cores and PVC insulation, so according to that chart it will be fine carrying around 7.0A at 30C ambient. Multiply that by 3 for the three 12V leads in a typical PCIe power cable and you get ~250W at 12V. In practice, I run each 1080 Ti at 275W powered by one 6-pin PCIe cable + a splitter at the end.
-scsi
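
If you want to plug in your own figures, here's a minimal Python sketch of the arithmetic used throughout this thread. The three-12V-leads-per-cable assumption and the per-conductor ampacity values are the ones quoted in the posts above, not numbers from any official spec, so treat the output as a rough estimate:

```python
# Rough per-cable power estimate: P = I_per_conductor * V * number_of_12V_leads.
# Assumption from the thread: a 6-pin or 8-pin PCIe cable carries three 12V leads.

V_RAIL = 12.0        # volts on each power conductor
CONDUCTORS = 3       # 12V leads per PCIe power cable (assumption from the thread)

def cable_power_watts(amps_per_conductor, conductors=CONDUCTORS, volts=V_RAIL):
    """Maximum continuous power one cable can carry at the given per-conductor ampacity."""
    return amps_per_conductor * volts * conductors

# Per-conductor ampacity figures for AWG16 quoted in the thread
estimates = {
    "powerstream figure quoted above": 3.7,        # -> ~133W per cable
    "rough figure from the second post": 6.0,      # -> 216W per cable
    "engineeringtoolbox, PVC, 30C ambient": 7.0,   # -> ~252W per cable
}

for source, amps in estimates.items():
    print(f"{source}: {amps}A per conductor -> {cable_power_watts(amps):.0f}W per cable")
```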