MaarioS wrote:
So, judging by explanations, I can add watts: 15+35+17=67W but what happens to the 4th one?? Should I multiply it by 0.8PF?? This results in 7.2W or I can multiply it by 1PF to be safe which results in 9W, meaning the total consumption is ~76W.
Well, worst case, remember that volt-amps will always be greater than or equal to watts. So the safe number is 9W, like you said.
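To put that in one place, here's the worst-case arithmetic, assuming (from your numbers) that the fourth adapter is rated 9 VA:

```python
# Worst-case load estimate: count each VA-rated adapter at its full VA,
# since real power (W) can never exceed apparent power (VA).
adapters_w = [15, 35, 17]  # adapters with ratings given directly in watts
fourth_va = 9              # fourth adapter's VA rating (assumed from your post)

worst_case_total = sum(adapters_w) + fourth_va
print(worst_case_total)  # 76
```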
MaarioS wrote:
In conclusion, I can assume I can plug 3-4 more AC adapters with same parameters and be safe but I should be aware plugging in more of them, right??
Yeah, just do the math on it.
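A rough sketch of that math, using the 184 VA secondary figure from further down in this thread and the worst-case 76 VA load above (both are this thread's numbers, not a general rule):

```python
# Rough headroom check: how many more 9 VA adapters fit on the supply?
# 184 VA usable comes from the 115 V x 1.6 A secondary mentioned below;
# 76 VA is the worst-case total of the adapters already plugged in.
usable_va = 115 * 1.6            # 184.0 VA available at the secondary
already_used = 15 + 35 + 17 + 9  # worst-case VA of the current adapters
per_adapter = 9                  # each extra adapter, counted at its VA rating

more_adapters = int((usable_va - already_used) // per_adapter)
print(more_adapters)  # 12
```

That says there's arithmetic room for more than 3-4, but leaving a healthy margin (as you're planning) is still the right instinct.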
MaarioS wrote:
Ah 1 more question: should I be concerned that there are slight differences in voltages like 100V AC input or 120V AC input??
I've heard of some cases where that matters with really cheap Japanese stuff and really crappy unregulated power supplies from the 1970s, and in those cases, sure, you'll blow up an electrolytic capacitor that was already on its way out anyway. But for the most part, don't worry about it. 120V or 110V or 100V is just the voltage it was rated at... That being said, a toaster rated at 120V would be somewhat disappointing if it were run at 100V, and would take more time than expected. A 100V toaster run at 120V would be touchy and likely to burn the bread... maybe that's a better example?
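The toaster example can be made quantitative: a purely resistive load draws P = V²/R, so power scales with the square of the voltage ratio. The 1000 W rating here is just a made-up illustration:

```python
# For a resistive load (toaster, incandescent bulb), power scales with
# the square of the voltage ratio: P_actual = P_rated * (V/V_rated)^2.
def power_at_voltage(p_rated, v_rated, v_actual):
    return p_rated * (v_actual / v_rated) ** 2

# A (hypothetical) 1000 W, 120 V toaster on a 100 V supply: slow toast.
print(round(power_at_voltage(1000, 120, 100)))  # 694

# The same wattage rated at 100 V but run on 120 V: burnt toast.
print(round(power_at_voltage(1000, 100, 120)))  # 1440
```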
I'd also say to be concerned about frequency, as certain transformers really, really want the 50 Hz or 60 Hz they're rated at, and become inefficient (as in heat up dramatically) if given a different frequency (a lower one is worse, since core flux goes up). Too much heat melts the insulation (lacquer, really) on the transformer windings, and if that happens the transformer is permanently damaged (shorted) and could, worst case, cause a fire.
Okay, to sum this up: I'd bet that if you gave a 60 Hz, 100 VAC device a supply of 125 VAC at 50 Hz, you'd kill it within the hour at rated load. Also, it's okay to be suspicious of the 100V-rated device; just keep an eye on it and cut power to it if it heats up too much. If it does, shelve it, as it's not worth the risk, and maybe try to get a differently tapped transformer that puts out a lower voltage for it.
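Why that combination is so lethal: transformer core flux is proportional to V/f, so overvoltage and under-frequency multiply. A quick sketch:

```python
# Core flux in a transformer is proportional to V/f. Running at a higher
# voltage AND a lower frequency than rated multiplies the overflux.
def flux_ratio(v_actual, f_actual, v_rated, f_rated):
    # Relative core flux vs. what the transformer was designed for
    return (v_actual / v_rated) * (f_rated / f_actual)

# Rated 100 VAC / 60 Hz, fed 125 VAC / 50 Hz:
print(flux_ratio(125, 50, 100, 60))  # 1.5 -> 50% over design flux
```

At 1.5x design flux the core saturates, magnetizing current shoots up, and the windings cook.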
MaarioS wrote:
By the way, crap, you can blame me but what exactly is the power factor or better yet, how should I calculate it or measure it??
Power factor is the ratio of the real power a load consumes (watts) to the apparent power flowing to it (volt-amps). It drops below one when the load pulls its current out of phase with the voltage, or distorts the theoretical perfect sine wave being fed to it. Power factor is like a hidden cost in that, surprise-surprise, the worse it gets (the closer to zero), the more one has to oversize the transformer, in proportion.
A light bulb (incandescent, with the really thin filament wire) is almost perfectly resistive and has a power factor very close to one: essentially all the energy delivered to it, it consumes.
For something like a shaded-pole tabletop fan, you could have a power factor of 0.6, which means the coils and rotating magnetic fields interact such that there is something called "reactive power", stored energy bouncing back and forth in the wires, in addition to the energy being directly used to rotate the fan blades and blow air. This extra oscillating energy has to be considered: if it's a 40 Watt fan, there is probably about 66 Volt-Amps flowing around. You only pay the power company for the 40 Watts, but the load acting on the nearest transformer (wherever it is) is about 66 VA.
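The relationship behind those numbers is just S = P / PF, apparent power equals real power divided by power factor:

```python
# Apparent power (VA) from real power (W) and power factor: S = P / PF.
def apparent_power_va(real_power_w, power_factor):
    return real_power_w / power_factor

fan_va = apparent_power_va(40, 0.6)  # the 40 W, PF 0.6 fan from above
print(round(fan_va, 1))  # 66.7 VA loading the transformer, though you pay for 40 W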
It's not really to one's benefit to "fix" the power factor of the 40 Watt fan (say, with a correction capacitor); you'd still only be consuming and paying for the 40 Watts, and for a small load like that the reduction in volt-amps isn't worth the trouble. But... if you run equipment on the same circuit that likes a nice sine-wave AC input, a poor power factor can be detrimental to performance (audio equipment), reduce efficiency, or in extreme cases cause malfunctions.
All I was really trying to say is that one can't plug 200 Watts' worth of real-world devices into a 200 VA transformer. The transformer has internal losses itself (remember, 115 x 1.6 = 184 VA available at the secondary, but that's decent, as (184/200) = 92% efficiency), and the power factor of the loads lowers the number of effective watts it can supply even further.
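Putting both derating effects together, a hypothetical sketch using the 184 VA figure above (the power-factor values are just examples):

```python
# Effective watts a transformer can deliver once load power factor is
# accounted for: usable W = usable VA * PF of the load mix.
secondary_va = 115 * 1.6  # 184 VA usable at the secondary (example above)

for pf in (1.0, 0.8, 0.6):
    print(pf, round(secondary_va * pf, 1))
# 1.0 184.0
# 0.8 147.2
# 0.6 110.4
```

So the nominal "200 VA" transformer is really good for somewhere between ~110 and ~184 real watts, depending on what's plugged into it.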