This study proposes a multi-use energy storage system (ESS) framework for participating in both price-based and incentive-based demand response programs using reinforcement learning (RL) on the demand side. We focused on industrial customers, providing them the opportunity to earn additional profit through market participation while managing their own load. Because industrial customers pay their electricity bills under a time-of-use (TOU) tariff structure, they can benefit by shifting electricity usage from high-price hours to low-price hours. Furthermore, they can obtain additional incentives by fulfilling dispatch signals from the system operator using the ESS. To model multi-use ESS operation by industrial users, we adopted the RL framework for customer decision-making. The RL approach learns a control policy by interacting with an environment, requiring no prior knowledge. To this end, we formulated ESS operation as a Markov decision process so that the environmental information observed by the customer drives the RL agent, which takes actions that are optimal for the current environment with respect to customer benefit. We developed several RL agents to identify an acceptable control agent and trained them on an actual industrial load profile from South Korea. The experimental results demonstrated that the proposed framework can make near-optimal decisions for using the ESS in multiple ways.
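As an illustrative sketch only (not the paper's implementation), the price-based part of the problem can be framed as a Markov decision process and solved with tabular Q-learning standing in for the paper's RL agents. All names, the two-level TOU tariff, the ESS capacity, and the hyperparameters below are hypothetical assumptions, and the incentive-based dispatch signal is omitted for brevity.

```python
# Hypothetical MDP sketch of TOU-based ESS arbitrage (illustration only).
import random


class ESSEnv:
    """Toy ESS environment: state = (hour, state of charge in kWh);
    action = -1 (discharge), 0 (idle), +1 (charge)."""

    def __init__(self, capacity_kwh=10.0, step_kwh=2.0):
        self.capacity = capacity_kwh
        self.step_kwh = step_kwh
        # Assumed two-level TOU tariff ($/kWh): off-peak 0-7 and 22-23, peak 8-21.
        self.price = [0.05 if h < 8 or h >= 22 else 0.15 for h in range(24)]
        self.reset()

    def reset(self):
        self.hour, self.soc = 0, 0.0
        return (self.hour, self.soc)

    def step(self, action):
        # Clip the energy moved so the state of charge stays in [0, capacity].
        delta = max(-self.soc, min(self.capacity - self.soc,
                                   action * self.step_kwh))
        # Reward: revenue when discharging, cost when charging, at the TOU price.
        reward = -delta * self.price[self.hour]
        self.soc += delta
        self.hour += 1
        return (self.hour, self.soc), reward, self.hour == 24


ACTIONS = (-1, 0, 1)


def train(episodes=5000, alpha=0.1, gamma=0.99, eps=0.15, seed=0):
    """Tabular Q-learning over the discretized (hour, SoC) state space."""
    rng, env, q = random.Random(seed), ESSEnv(), {}
    qval = lambda s, a: q.get((s, a), 0.0)
    for _ in range(episodes):
        s, done = env.reset(), False
        while not done:
            a = (rng.choice(ACTIONS) if rng.random() < eps
                 else max(ACTIONS, key=lambda a: qval(s, a)))
            s2, r, done = env.step(a)
            best = 0.0 if done else max(qval(s2, b) for b in ACTIONS)
            q[(s, a)] = qval(s, a) + alpha * (r + gamma * best - qval(s, a))
            s = s2
    return q, env


def greedy_profit(q, env):
    """Roll out the greedy policy for one day and return the daily profit."""
    s, done, profit = env.reset(), False, 0.0
    while not done:
        a = max(ACTIONS, key=lambda a: q.get((s, a), 0.0))
        s, r, done = env.step(a)
        profit += r
    return profit


if __name__ == "__main__":
    q, env = train()
    # A reasonable policy charges off-peak and discharges on-peak for a profit.
    print(round(greedy_profit(q, env), 2))
```

Under this toy tariff, the best achievable daily profit is the 10 kWh capacity times the $0.10/kWh peak/off-peak spread, i.e. $1.00; the learned greedy policy should approach it by charging before hour 8 and discharging during peak hours.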
This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea (No. 20191210301820). This work was also supported by the Ajou University research fund.