Ajou University repository

A multi-use framework of energy storage systems using reinforcement learning for both price-based and incentive-based demand response programs
  • Oh, Seongmun ;
  • Kong, Junhyuk ;
  • Yang, Yejin ;
  • Jung, Jaesung ;
  • Lee, Chul Ho
Citations (SCOPUS)
0

Publication Year
2022-01-01
Journal
International Journal of Electrical Power and Energy Systems
Publisher
Elsevier Ltd
Citation
International Journal of Electrical Power and Energy Systems, Vol.144
Keyword
Demand response; Electricity market; Energy storage system; Reinforcement learning
Mesh Keyword
Demand response; Demand response programs; Energy storage system; Incentive-based demand response; Industrial customer; Price incentives; Price-based; Reinforcement learning agent; Reinforcement learnings; Storage systems
All Science Classification Codes (ASJC)
Energy Engineering and Power Technology; Electrical and Electronic Engineering
Abstract
This study proposes a multi-use energy storage system (ESS) framework that participates in both price-based and incentive-based demand response programs using reinforcement learning (RL) on the demand side. We focus on industrial customers, providing them with the opportunity to earn additional profit through market participation in addition to managing their load. Because industrial customers pay their electricity bills under a time-of-use tariff structure, they benefit by shifting electricity usage from high-price hours to low-price hours. Furthermore, they can obtain additional incentives by fulfilling dispatch signals from the system operator using the ESS. To model multi-use ESS operation by industrial users, we apply an RL framework to customer decision-making. The RL approach learns a control action policy by interacting with the environment without prior knowledge. To this end, we formulate ESS operation as a Markov decision process, so that the environmental information observed by the customer is provided to the RL agent, which selects actions for the current environment that maximize customer benefit. We develop several RL agents to identify an acceptable control agent and train them on an actual industrial load profile from South Korea. The experimental results demonstrate that the proposed framework makes near-optimal decisions for using the ESS in multiple ways.
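The MDP formulation described in the abstract (state = hour and ESS state of charge, action = charge/idle/discharge, reward = customer benefit under a time-of-use tariff) can be illustrated with a minimal tabular Q-learning sketch. The tariff values, ESS capacity, and all function names below are illustrative assumptions, not the paper's actual model, which additionally handles incentive-based dispatch signals from the system operator:

```python
import random

# Hypothetical time-of-use tariff (price units per energy unit) over a
# 24-hour day; these numbers are illustrative, not from the paper.
TOU_PRICE = [50] * 8 + [150] * 4 + [100] * 4 + [150] * 6 + [50] * 2

CAPACITY = 4            # ESS capacity in discrete energy units (assumed)
ACTIONS = [-1, 0, 1]    # discharge one unit, idle, charge one unit

def step(hour, soc, action):
    """One MDP transition: state = (hour, soc), reward = billing impact."""
    soc_next = min(max(soc + action, 0), CAPACITY)  # clamp to ESS limits
    delta = soc_next - soc                          # energy actually moved
    reward = -delta * TOU_PRICE[hour]               # pay to charge, earn to discharge
    return (hour + 1) % 24, soc_next, reward

def train(episodes=3000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning over the (hour, soc) state space."""
    rng = random.Random(seed)
    q = {(h, s): [0.0] * len(ACTIONS)
         for h in range(24) for s in range(CAPACITY + 1)}
    for _ in range(episodes):
        hour, soc = 0, 0
        for _ in range(24):  # one day per episode
            if rng.random() < eps:  # epsilon-greedy exploration
                a = rng.randrange(len(ACTIONS))
            else:
                a = max(range(len(ACTIONS)), key=lambda i: q[(hour, soc)][i])
            nh, ns, r = step(hour, soc, ACTIONS[a])
            # Standard Q-learning update toward the bootstrapped target.
            q[(hour, soc)][a] += alpha * (r + gamma * max(q[(nh, ns)])
                                          - q[(hour, soc)][a])
            hour, soc = nh, ns
    return q

def greedy_day(q):
    """Roll out the learned greedy policy for one day; return schedule, profit."""
    hour, soc, profit, schedule = 0, 0, 0.0, []
    for _ in range(24):
        a = max(range(len(ACTIONS)), key=lambda i: q[(hour, soc)][i])
        hour, soc, r = step(hour, soc, ACTIONS[a])
        schedule.append(ACTIONS[a])
        profit += r
    return schedule, profit
```

With this toy tariff the learned policy charges during low-price hours and discharges during high-price hours, yielding a positive daily profit, which mirrors the load-shifting benefit the abstract describes for the price-based program.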
ISSN
0142-0615
Language
eng
URI
https://aurora.ajou.ac.kr/handle/2018.oak/32845
https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85135819508&origin=inward
DOI
https://doi.org/2-s2.0-85135819508
Journal URL
https://www.journals.elsevier.com/international-journal-of-electrical-power-and-energy-systems
Type
Article
Funding
This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea (No. 20191210301820). This work was also supported by the Ajou University research fund.

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jung, Jaesung 정재성
Department of Electrical and Computer Engineering

File Download

  • There are no files associated with this item.