What is it about?

After reactive power compensation, voltage-dependent loads, such as constant-impedance and constant-current loads, consume more power because of the increase in node voltage; as a result, customers pay more for their electricity while utilities see savings from line-loss reduction. A voltage-reduction strategy therefore plays an important role in reducing total energy consumption during reactive power compensation.
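To make the voltage dependence concrete, here is a minimal sketch of the standard ZIP load model, in which load power is a weighted sum of constant-impedance, constant-current, and constant-power terms. The coefficient values and the voltage change below are hypothetical illustrations, not values from the paper.

```python
# Minimal sketch (not from the paper) of the standard ZIP load model,
# illustrating why constant-impedance and constant-current loads draw
# more power when the node voltage rises. All coefficients below are
# hypothetical.

def zip_power(p0, v, v0=1.0, z=0.4, i=0.3, p=0.3):
    """Active power of a ZIP load at per-unit voltage v.

    p0      -- nominal power at nominal voltage v0
    z, i, p -- constant-impedance, constant-current, and constant-power
               fractions (must sum to 1)
    """
    r = v / v0
    return p0 * (z * r**2 + i * r + p)

# Capacitor switching raises the node voltage, say from 0.98 to 1.02 pu:
before = zip_power(100.0, 0.98)   # ~ 97.8 kW
after = zip_power(100.0, 1.02)    # ~102.2 kW
print(f"demand before: {before:.1f} kW, after: {after:.1f} kW")
```

The constant-power fraction is unaffected by voltage, so the demand increase comes entirely from the impedance and current terms, which is exactly what a voltage-reduction strategy targets.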


Why is it important?

In this paper, we have presented a rationale for the necessity of reducing voltage during reactive power compensation and determined the optimal voltage setting at the substation regulator. We have also analyzed the joint effects of ambient temperature, price, size, and phase kVAr of the capacitor on line loss and load demand using a 2⁴ factorial design.
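As an illustration of how a 2⁴ (16-run) factorial design works, the sketch below enumerates the low/high combinations of the four factors and estimates each factor's main effect. The response function is a hypothetical stand-in for the simulated line loss, not the authors' model.

```python
# Sketch (not the authors' code) of a 2^4 full factorial design:
# four factors at low/high coded levels give 2^4 = 16 runs.

from itertools import product

factors = ["ambient_temperature", "capacitor_price",
           "capacitor_size", "phase_kvar"]

# Coded levels: -1 = low, +1 = high
runs = list(product([-1, +1], repeat=len(factors)))
assert len(runs) == 16

def response(levels):
    """Hypothetical line-loss response, used only to make the sketch runnable."""
    t, price, size, kvar = levels
    return 5.0 - 0.8 * size - 0.6 * kvar + 0.3 * t + 0.1 * size * kvar

observations = [response(r) for r in runs]

# Main effect of a factor = mean response at its high level
# minus mean response at its low level.
for j, name in enumerate(factors):
    hi = [y for r, y in zip(runs, observations) if r[j] == +1]
    lo = [y for r, y in zip(runs, observations) if r[j] == -1]
    effect = sum(hi) / len(hi) - sum(lo) / len(lo)
    print(f"main effect of {name}: {effect:+.2f}")
```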

Read the Original

This page is a summary of: Importance of Voltage Reduction and Optimal Voltage Setting During Reactive Power Compensation, IEEE Transactions on Power Delivery, August 2014, Institute of Electrical & Electronics Engineers (IEEE), DOI: 10.1109/tpwrd.2014.2306194.
You can read the full text via the DOI above.
