The Ugly Truth of Demand Charges and How to Save Money
Updated: Feb 17
Let’s start by saying that demand charges aren’t a new invention.
In fact, demand charges have been around for over 100 years, since the early days of electricity grids. Demand charges are simply a way of charging those whose electricity use exceeds a normal amount (more on how it is calculated later).
The problem is, the technology used to supply and monitor power usage has progressed significantly over the last century, but the method of charging remains more or less the same.
Some might say that there's no need to update the system. For years we've differentiated between electricity use (kilowatt-hours, kWh) and demand (kilowatts, kW). Isn't this enough? Look at the water supply industry: water use is calculated from the amount of water that runs through the pipe, and demand from the size of the pipe.
But there are fundamental differences between power and water supply. In this article, we’ll begin by describing how demand charges are calculated, why demand charges are seen to be unfair for some users, and look at ways to reduce and control demand charges.
The History of Demand Charges
Unbelievably, the first energy tariff ever calculated was based simply on the number of lightbulbs the user had!
Charging for electricity consumption used this rudimentary method until early energy suppliers such as Edison realized they had a problem. They were expected to provide a stable power supply for small residential users and large industrial users alike.
Initially, they began averaging the costs and charging every customer the same amount. But as electric-powered machinery proliferated and industrial use escalated, this was soon viewed as unfair to small users.
Also, electricity use was increasing across the board, and more distribution capacity was required. Enter Arthur Wright, inventor of the "Demand Meter". He discovered that by charging for the maximum instantaneous demand (kW), while also considering total consumption over time (kWh), he could encourage customers to spread out their energy usage, reducing the capacity needs of the system and thereby the overall costs.
This invention led to the first demand meters, developed by Chicago Edison in 1897. Initially, the demand charge was approximately 20 cents per kW.
All in all, not much has changed in the 120 years since. We still use demand meters, although the tariff has risen considerably, year on year.
Changes in Technology, No Change in Charging
Recent years, in particular, have seen big technical improvements in energy metering and meter reading, with the introduction of smart meters. Since this development, many people see demand charges as unfair, arguing that there should now be a more accurate way to calculate usage and cost.
From an engineering point of view, it makes sense to use the old method, as it reduces the capacity needs of the system. From a user's point of view, it seems unfair to pay a high energy bill just because of a single surge of energy use during one 15-minute period in the month.
For example, EV users are hit hard under this system as they will create a large spike every night while recharging their vehicles.
Also, most utility companies specify the maximum power demand a customer is allowed per month. Exceeding that maximum power demand over consecutive months often results in the customer being moved to a tariff with a higher demand charge.
How Demand Charges Are Calculated
Demand charges can make a significant difference to monthly electric bills.
The basic formula to calculate a demand charge is:
X kW of peak demand × Y $/kW = monthly demand charge ($)
For example, if the rate includes demand charges set at $12 per kW, and the peak demand is 400 kW for the month, the demand charge works out as:
400 kW × $12/kW = $4,800
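The calculation above can be sketched in a few lines of code. This is a minimal illustration, not a billing implementation: it assumes the utility bills on the highest 15-minute average power reading of the month, and the readings and rate below are made-up numbers.

```python
# Minimal sketch: compute a monthly demand charge from 15-minute
# average power readings (kW). Readings and rate are illustrative.

def demand_charge(readings_kw, rate_per_kw):
    """Demand charge = highest 15-minute average demand * $/kW rate."""
    peak_kw = max(readings_kw)
    return peak_kw * rate_per_kw

# A full month would hold ~2,880 readings; a few suffice to illustrate.
readings = [310.0, 285.5, 400.0, 372.25, 298.0]
print(demand_charge(readings, 12.0))  # peak of 400 kW * $12/kW = 4800.0
```

Note how a single 400 kW reading sets the charge for the whole month, regardless of how low demand was the rest of the time.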
Some areas have even introduced a mandatory residential demand charge. Massachusetts, for example, has imposed a demand charge since 2018.
How to Reduce Demand Charges
The good news is that everyone can “fight” demand charges.
There are three main steps to reduce demand charges: define, understand and reduce.
Define: Energy demand is usually driven by a short intensive use of energy. This can be caused by factors such as weather (more heating or cooling needed), or a large wave of users (e.g. many vehicles arriving at an EV charger at the same time, at a workplace for instance). You need to define what causes the demand and how you can react.
Understand: Use a monitoring system and simulate the demand for your site under different scenarios. Identify what is driving demand and calculate the financial impact of demand charges on your total operational costs. For example, at ampcontrol.io we use precise simulation tools to model the charging of EVs even before a new site is installed.
Reduce: Active management of demand can't take place 24 hours a day. Instead, focus on the main peak activities and actively manage those periods, once or twice per day or week. You only need to reduce the highest peak during the month, or prevent your demand from exceeding a certain limit.
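The "reduce" step can be sketched as a simple demand limit: whenever the site's combined demand would exceed a chosen ceiling, a flexible load (EV charging, in this example) is curtailed to fit the remaining headroom. All names and numbers here are hypothetical, and a real controller would of course be more sophisticated than this.

```python
# Illustrative sketch of peak shaving under a demand limit: curtail a
# flexible load (e.g. EV charging) so total site demand stays capped.

def curtail(base_load_kw, ev_request_kw, limit_kw):
    """Return the EV charging power allowed under the demand limit."""
    headroom = max(0.0, limit_kw - base_load_kw)
    return min(ev_request_kw, headroom)

# Base building load of 350 kW, EVs requesting 120 kW, 400 kW limit:
print(curtail(350.0, 120.0, 400.0))  # 50.0 kW of charging is allowed
```

Deferring the remaining 70 kW of charging by even one 15-minute interval can be enough to avoid setting a new monthly peak.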
Reducing demand is definitely worth doing. Just think: with demand charges of $15/kW, you save $18,000 per year by reducing your peak demand by 100 kW.
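The savings figure above is simple arithmetic, sketched here for clarity (the rate and reduction are the illustrative figures from the text):

```python
# Back-of-the-envelope check of the annual savings quoted above.

def annual_savings(rate_per_kw, peak_reduction_kw, months=12):
    """Yearly savings from shaving the monthly peak by a fixed amount."""
    return rate_per_kw * peak_reduction_kw * months

print(annual_savings(15.0, 100))  # $15/kW * 100 kW * 12 months = 18000.0
```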
Reducing Demand for EV Charging Point Operators
At ampcontrol.io – a smart charging solution for electric vehicles – we have developed a predictive monitoring and control system for charging point operators.
This intelligent system saves you money each month, along with the time and effort of monitoring and controlling demand yourself.
Follow us here to receive the next article!