Ex: there is a line that powers 3 light fixtures and 2 receptacles. 110 volts x 15 amps = 1650 watts max. You want to stay under 80% of that (about 1320 W), so call it 1200-ish to be safe. When your 3 lights are turned on they each use 100 watts, so you have 1200 - 300 = 900 W left to use on the receptacles of that line. If you have a rig of say 6 x 1060s running at 80 W apiece, plus a bit for CPU/mobo/fans, on say an 850 W PSU, you are good. You don't want to use all of the PSU capacity either; leave lots of overhead there too.
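If it helps, here's that same math as a quick Python sketch; the voltage, breaker rating, and per-device wattages are just the example numbers above, so plug in your own:

```python
# Rough circuit-headroom check. All values are the example numbers from
# the post above, not measurements -- substitute your own circuit's specs.
CIRCUIT_VOLTS = 110        # nominal; US circuits are typically 120 V in practice
BREAKER_AMPS = 15          # breaker rating for the circuit
SAFETY_FACTOR = 0.80       # stay at or under 80% of capacity for continuous loads

circuit_max_w = CIRCUIT_VOLTS * BREAKER_AMPS          # 1650 W
safe_budget_w = circuit_max_w * SAFETY_FACTOR         # 1320 W

lights_w = 3 * 100                                    # three 100 W fixtures
rig_w = 6 * 80 + 150                                  # six GPUs at ~80 W each, plus an
                                                      # assumed ~150 W for CPU/mobo/fans

remaining_w = safe_budget_w - lights_w
print(f"Left for receptacles after lights: {remaining_w:.0f} W")   # ~1020 W
print("rig fits" if rig_w <= remaining_w else "rig is over the safe budget")
```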
That is just an example, but I and many others here would recommend spending a bit of time researching and running the numbers so you can run your rigs safely, as there have been some bad incidents in apartment buildings in the news lately. Good luck. There are some electricians on here who can weigh in too.
Modern LED bulbs only use 7 to 10 watts each, and even the older spiral CFLs are maybe 20 W. Hardly anyone uses incandescents in the US anymore.
Best thing to do is measure with a Kill A Watt, then calculate what else is on the line. My advice: turn off the breaker you plan on using and see what else dies. If it's just lights, 100 W is like 10 LED bulbs, which means the rig could have a max draw of around 1200-1300 W and still stay at 80% of the circuit.
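Same idea as a tiny function, if you'd rather not do it by hand; the 110 V / 15 A / 100 W numbers are placeholders for whatever your breaker panel and Kill A Watt actually say:

```python
def max_rig_draw_w(volts: float, breaker_amps: float, other_loads_w: float,
                   safety: float = 0.80) -> float:
    """Watts left for the rig after reserving 20% breaker headroom
    and subtracting everything else measured on the circuit."""
    return volts * breaker_amps * safety - other_loads_w

# e.g. a 15 A breaker at 110 V with 100 W of lights measured on the line:
print(max_rig_draw_w(volts=110, breaker_amps=15, other_loads_w=100))  # 1220.0 W
```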