What is load factor used for in a power system?
Load factor, in the electrical context, is defined as the ratio of the average load of the electrical system to the maximum demand it attained. This factor is used to gauge how far the average load sits below the peak load. Like the demand factor, the load factor is also expressed as a percentage.
Since losses increase with the square of the load current (I²), and since the cost of losses is highest during peak-load periods, the relationship between the peak load on a system component and its average loading is important in any study of losses on that component. This is why load factor, calculated as the ratio of average load to maximum demand, is a key figure for describing the relationship between peak and average loads. (TVPPA, Nov. 1994)
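To make the square-law relationship concrete, here is a minimal Python sketch. The resistance and current values are illustrative assumptions, not figures from the source; it simply shows how resistive (I²R) losses grow with the square of the load current:

```python
def resistive_loss_kw(current_a: float, resistance_ohm: float) -> float:
    """I^2 * R losses, converted from watts to kilowatts."""
    return current_a ** 2 * resistance_ohm / 1000.0

# Illustrative (assumed) values: a feeder with 0.5 ohm resistance,
# carrying 100 A on average and 150 A at peak.
avg_loss = resistive_loss_kw(100.0, 0.5)   # 5.0 kW
peak_loss = resistive_loss_kw(150.0, 0.5)  # 11.25 kW

# A 50% rise in current more than doubles the losses (1.5^2 = 2.25),
# which is why peak periods dominate the cost of losses.
print(peak_loss / avg_loss)  # 2.25
```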
The ideal value for load factor is 100%, which means the load consumed by the system exactly matches the capacity provided to serve it. A low load factor means the system capacity is not maximized, because the gap between the average load and the peak demand is large. This translates to the utility putting up capacity that sits idle most of the time.
A certain mall, a contestable customer, was read with an average monthly load of 277.8 kW, while the maximum registered meter demand reached 380 kW. What is its load factor?
LF = (Average Load) / (Maximum Demand) = 277.8 kW / 380 kW = 73.1%
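The worked example above can be sketched in Python; the function name is a hypothetical choice for illustration, and the inputs are the mall's figures from the example:

```python
def load_factor_percent(average_load_kw: float, max_demand_kw: float) -> float:
    """Load factor = average load / maximum demand, as a percentage."""
    return average_load_kw / max_demand_kw * 100.0

# The mall example: 277.8 kW average load vs. 380 kW maximum demand.
lf = load_factor_percent(277.8, 380.0)
print(round(lf, 1))  # 73.1
```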