No, current doesn't drop unless you're hooked up to a resistor. A power supply is (close to) a constant-power device, so current will increase in proportion to the voltage drop. It will be pretty minor, though.
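For a constant-power load the relation is just I = P / V, so a small drop in supply voltage means a small rise in current. A quick sketch with assumed round numbers (the 2 V drop is just an illustration):

```python
# A constant-power load draws I = P / V: less voltage in, more current drawn.
P_WATTS = 1400  # nominal draw of the miner discussed below

for volts in (120.0, 118.0):  # e.g. before/after an assumed ~2 V cord drop
    print(f"{volts:.0f} V -> {P_WATTS / volts:.2f} A")
# 120 V -> 11.67 A; 118 V -> 11.86 A: a pretty minor increase, as noted.
```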
The sole job of resistance is to oppose current (DC or AC). Also, resistance will not drop (consume) any voltage unless current is flowing through it. Since AC will flow ON or THROUGH a capacitor whether the capacitor is open (normal) or shorted (abnormal), voltage is dropped by the resistance of the capacitor's metal plates (or, here, the line cord). However, I will not continue into the "science" of electrons flowing on/through a capacitor. I will simply provide a table below of current ratings for wire other than magnet wire.
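To make the no-current, no-drop point concrete, here is a minimal Ohm's-law sketch; the cord resistance used is an assumed placeholder, not a measured value:

```python
# Ohm's law: the voltage dropped across a resistance is V = I * R.
# With zero current flowing, a cord drops zero volts no matter its resistance.

def cord_drop(current_a: float, resistance_ohm: float) -> float:
    """Voltage dropped across a cord with the given series resistance."""
    return current_a * resistance_ohm

R_CORD = 0.06  # ohms -- assumed round-trip resistance of a short cord

print(cord_drop(0.0, R_CORD))    # 0.0 V: no current, no drop
print(cord_drop(11.67, R_CORD))  # ~0.70 V once current is flowing
```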
Current Ratings: Most current ratings for wires (except magnet wires) are based on permissible voltage drop, not temperature rise. For example, 0.5 mm^2 wire is rated at 3 A in some applications but will carry over 8 A in free air without overheating. You will find tables of permitted maximum current in national electrical codes, but these are based on voltage drop (not on heating, which is no problem at the currents those codes allow). Which is why I say again: the wire gauge (AWG) used is more important than the length (when looking at power cords for peripherals).
Here is a small current and AWG table taken from the American Radio Relay League (ARRL) Handbook, 1985.
AWG   dia. (mils)   circ. mils   open-air A   cable A   ft/lb (bare)   ohms/1000 ft
10    101.9         10380        55           33        31.82          1.018
12    80.8          6530         41           23        50.59          1.619
14    64.1          4107         32           17        80.44          2.575
A mil is 0.001". "Open-air A" is a continuous rating for a single insulated conductor in open air; "cable A" is the rating for conductors in multiple-conductor cables. Disregard the amperage ratings for household use.
To calculate voltage drop, plug in the values: V = D × I × R / 1000, where I is the current in amperes, R is the ohms/1000 ft figure from the column above, and D is the total distance in feet that the current travels (don't forget to add the lengths of the hot and neutral together, i.e., usually double the cable length).
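As a quick sanity check of that formula, here is a minimal Python sketch; the resistance figures come straight from the table above, and the 12 ft cord and ~11.67 A load match the S4 example below:

```python
# Ohms per 1000 ft, from the ARRL table above.
OHMS_PER_1000FT = {10: 1.018, 12: 1.619, 14: 2.575}

def voltage_drop(awg: int, cable_ft: float, amps: float) -> float:
    """V = D * I * R / 1000, with D doubled for hot + neutral."""
    d = 2 * cable_ft  # round trip: hot and neutral
    return d * amps * OHMS_PER_1000FT[awg] / 1000

drop = voltage_drop(14, 12, 11.67)
print(f"{drop:.2f} V")              # ~0.72 V on a 12 ft 14 AWG cord
print(f"{100 * drop / 120:.2f} %")  # ~0.60 % of a 120 V supply
```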
Design rules in the CEC call for a maximum voltage drop of 6% (about 7.2 V on a 120 V circuit).

What I'm arguing is that a 14 AWG power line cord 12 feet in length will not get as hot as a 16 AWG power line cord of the same length in a circuit running at 11.67 A (a Bitmain Antminer S4 at 1400 W on 120 V draws 1400 / 120 ≈ 11.67 A). This is THE SAME THING you pointed out in your statement I quoted below.
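A back-of-the-envelope comparison makes the gauge point explicit. The 14 AWG resistance is from the table above; the 16 AWG figure of about 4.016 ohms/1000 ft is a commonly cited value for annealed copper, not from the table, so treat it as an assumption:

```python
# Compare I^2 * R heating of 14 AWG vs 16 AWG cords, both 12 ft long.
OHMS_PER_1000FT = {14: 2.575, 16: 4.016}  # 16 AWG value assumed, not from the table

AMPS = 1400 / 120  # Antminer S4: 1400 W on a 120 V circuit, ~11.67 A
CORD_FT = 12

for awg, ohms in OHMS_PER_1000FT.items():
    r = 2 * CORD_FT * ohms / 1000  # round-trip resistance of the cord
    watts = AMPS ** 2 * r          # power dissipated as heat in the cord
    print(f"{awg} AWG: {r:.4f} ohm, {watts:.1f} W of heat")
# 14 AWG: ~0.062 ohm, ~8.4 W; 16 AWG: ~0.096 ohm, ~13.1 W
```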
WE AGREE!!! GAUGE (AWG) IS MORE IMPORTANT THAN LENGTH, since the lengths we are talking about are really not very long at all.
Cable heating won't really change: while you'll have 3.3x the voltage drop with a 10 ft vs. 3 ft cable at essentially the same current, and will therefore dissipate 3.3x more power, the cable also has 3.3x more surface area to dissipate that heat. That's why wire gauges are rated for a certain current without regard for length.
MY SENTIMENTS EXACTLY
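That surface-area argument is easy to check numerically: total heat grows with cable length, but heat per foot (which is what actually sets the temperature rise) stays constant. A minimal sketch, reusing the 14 AWG figure from the table above:

```python
# Heat per foot of cord is independent of cord length at a fixed current,
# which is why ampacity ratings depend on gauge but not on length.
OHMS_PER_1000FT = 2.575  # 14 AWG, from the table above
AMPS = 11.67

for cable_ft in (3, 10):
    r = 2 * cable_ft * OHMS_PER_1000FT / 1000  # round-trip resistance
    watts = AMPS ** 2 * r
    print(f"{cable_ft:2d} ft: {watts:.2f} W total, {watts / cable_ft:.3f} W per foot")
# Total heat scales 3.3x with length, but W per foot stays the same.
```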