Power factor (sometimes loosely called load factor) is always a value between zero and one, and can be thought of as the fraction of a circuit's volt-amps that is delivered as real, usable power (watts) to a given device in a typical AC circuit. In a purely resistive circuit, or a DC circuit, watts and volt-amps are the same and are calculated by multiplying voltage and current together (V x I = W). The complication in most real-world AC circuits is that most loads are somewhat inductive, so the current they draw is somewhat out of phase with the voltage. As we know from audio, signals that are not entirely in phase with one another don't fully add together (and can even partially cancel one another). The same phenomenon occurs in AC power. Though the principle isn't exactly the same as summing different signals (voltage and current are both components of one circuit), the power factor accounts for the fact that you don't always get the full wattage (which does the real work) implied by your voltage and current. Actual usable wattage is therefore always somewhat lower than the volt-amp figure. The math looks like this:
Volts x Amps = VA (volt-amps), and VA x Power Factor = Watts used.
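The formula above can be sketched in a few lines of code. This is a minimal illustration, not a sizing tool: the 120 V / 10 A figures and the 30-degree phase angle are assumed example values, and the cosine relationship applies to the idealized case of a sinusoidal voltage driving a linear (e.g. purely inductive-resistive) load.

```python
import math

def real_power(volts, amps, power_factor):
    """Real power (watts) delivered to a load: volt-amps scaled by power factor."""
    if not 0.0 <= power_factor <= 1.0:
        raise ValueError("power factor must be between 0 and 1")
    apparent_power = volts * amps          # VA (volt-amps)
    return apparent_power * power_factor   # W (watts)

# For a sinusoidal source and a linear load, the power factor is the cosine
# of the phase angle between voltage and current; a 30-degree lag gives ~0.87.
pf = math.cos(math.radians(30))

resistive = real_power(120, 10, 1.0)  # purely resistive: all 1200 VA become watts
inductive = real_power(120, 10, pf)   # inductive load: only ~1039 W of real power
```

Note that the current in both cases is the same 10 A; the inductive load simply does less real work with it.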
What all of this means is that a device can draw more current than its wattage rating alone suggests, because part of the current it draws does no real work. Electric motors, for example, present notoriously inductive loads and will often draw much more current (particularly while starting up) than their wattage ratings would imply. Any time power usage must be determined precisely, such as when configuring electrical distribution or UPS systems, it is important to take each device's power factor into account.
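The motor example can be made concrete by inverting the formula to find the current a load actually draws. This is a sketch with assumed numbers: the 1000 W rating, 120 V supply, and power factor of 0.6 are illustrative values only (real motor power factors vary with design and loading, and starting inrush is higher still).

```python
def required_current(watts, volts, power_factor):
    """Line current (amps) a load actually draws, given its real-power rating."""
    return watts / (volts * power_factor)

# A naive W / V calculation for a hypothetical 1000 W motor on a 120 V
# circuit suggests about 8.3 A of current draw...
naive_amps = 1000 / 120

# ...but at an assumed running power factor of 0.6, the motor actually
# draws about 13.9 A, roughly two-thirds more than the naive estimate.
actual_amps = required_current(1000, 120, 0.6)
```

A UPS or distribution circuit sized from the naive figure would be undersized; this is why such equipment is typically rated in VA rather than watts.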