Ajou University repository

A multi-use framework of energy storage systems using reinforcement learning for both price-based and incentive-based demand response programs
  • Oh, Seongmun
  • Kong, Junhyuk
  • Yang, Yejin
  • Jung, Jaesung
  • Lee, Chul Ho
Citations (SCOPUS): 0
DC metadata (field: value)

dc.contributor.author: Oh, Seongmun
dc.contributor.author: Kong, Junhyuk
dc.contributor.author: Yang, Yejin
dc.contributor.author: Jung, Jaesung
dc.contributor.author: Lee, Chul Ho
dc.date.issued: 2022-01-01
dc.identifier.issn: 0142-0615
dc.identifier.uri: https://aurora.ajou.ac.kr/handle/2018.oak/32845
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85135819508&origin=inward
dc.description.abstract: This study proposes a multi-use energy storage system (ESS) framework for participating in both price-based and incentive-based demand response programs with reinforcement learning (RL) on the demand side. We focused on industrial customers to provide them with the opportunity to earn additional profit through market participation in addition to managing their load. Because industrial customers pay their electricity bills under a time-of-use tariff structure, they benefit if they can shift their electricity usage from high-price hours to low-price hours. Furthermore, they can obtain additional incentives by fulfilling dispatch signals from the system operator using the ESS. To model multi-use ESS operation by industrial users, we used an RL framework to make customer decisions. The RL approach learns a control action policy by interacting with an environment with no prior knowledge. To this end, we formulated ESS operation as a Markov decision process, so that the environmental information observed by the customers is provided to the RL agent, which takes actions that are optimal for the current environment while considering customer benefits. We developed several RL agents to identify an acceptable control agent and utilized an actual industrial load profile from South Korea to train them. The experimental results demonstrated that the proposed framework can make near-optimal decisions for using the ESS in multiple ways.
dc.description.sponsorship: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea (No. 20191210301820).
dc.description.sponsorship: This work was supported by the Ajou University research fund.
dc.language.iso: eng
dc.publisher: Elsevier Ltd
dc.subject.mesh: Demand response
dc.subject.mesh: Demand response programs
dc.subject.mesh: Energy storage system
dc.subject.mesh: Incentive-based demand response
dc.subject.mesh: Industrial customer
dc.subject.mesh: Price incentives
dc.subject.mesh: Price-based
dc.subject.mesh: Reinforcement learning agent
dc.subject.mesh: Reinforcement learnings
dc.subject.mesh: Storage systems
dc.title: A multi-use framework of energy storage systems using reinforcement learning for both price-based and incentive-based demand response programs
dc.type: Article
dc.citation.title: International Journal of Electrical Power and Energy Systems
dc.citation.volume: 144
dc.identifier.bibliographicCitation: International Journal of Electrical Power and Energy Systems, Vol.144
dc.identifier.doi: 2-s2.0-85135819508
dc.identifier.scopusid: 2-s2.0-85135819508
dc.identifier.url: https://www.journals.elsevier.com/international-journal-of-electrical-power-and-energy-systems
dc.subject.keyword: Demand response
dc.subject.keyword: Electricity market
dc.subject.keyword: Energy storage system
dc.subject.keyword: Reinforcement learning
dc.type.other: Article
dc.description.isoa: false
dc.subject.subarea: Energy Engineering and Power Technology
dc.subject.subarea: Electrical and Electronic Engineering
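The abstract in this record describes formulating ESS operation as a Markov decision process and training RL agents against a time-of-use tariff. The paper's actual state, action, and reward design is not given in this record, so the following is only a minimal tabular Q-learning sketch of the price-arbitrage idea; the tariff values, ESS size, and learning parameters are all hypothetical.

```python
import numpy as np

# Illustrative sketch only: the record's abstract says ESS operation is
# formulated as an MDP and solved with RL. All numbers below (TOU prices,
# ESS step size, learning rates) are hypothetical, not from the paper.

rng = np.random.default_rng(0)

HOURS = 24
SOC_LEVELS = 11                  # state: (hour, discretized state of charge)
ACTIONS = (-1, 0, 1)             # discharge / idle / charge, one SoC step
ESS_STEP_KWH = 10.0              # energy moved per SoC step (hypothetical)

# Hypothetical time-of-use tariff (KRW/kWh): cheap off-peak, expensive midday.
hours = np.arange(HOURS)
tou_price = np.where((hours >= 10) & (hours < 18), 150.0, 60.0)

q = np.zeros((HOURS, SOC_LEVELS, len(ACTIONS)))

def step(hour, soc, a_idx):
    """Apply one action; reward is the energy cost avoided (or incurred)."""
    new_soc = min(max(soc + ACTIONS[a_idx], 0), SOC_LEVELS - 1)
    moved = new_soc - soc        # 0 if the action hit a SoC limit
    reward = -moved * ESS_STEP_KWH * tou_price[hour]  # pay to charge, earn to discharge
    return (hour + 1) % HOURS, new_soc, reward

def train(episodes=3000, alpha=0.1, gamma=0.95, eps=0.1):
    for _ in range(episodes):
        soc = SOC_LEVELS // 2
        for hour in range(HOURS):
            if rng.random() < eps:
                a = int(rng.integers(len(ACTIONS)))   # explore
            else:
                a = int(np.argmax(q[hour, soc]))      # exploit
            nxt_hour, nxt_soc, r = step(hour, soc, a)
            # Bootstrap from the next state except at the end of the day.
            target = r + gamma * np.max(q[nxt_hour, nxt_soc]) * (hour < HOURS - 1)
            q[hour, soc, a] += alpha * (target - q[hour, soc, a])
            soc = nxt_soc

train()
# The learned greedy policy tends to charge in cheap hours and discharge in
# expensive hours, i.e. the load-shifting behavior described in the abstract.
```

This deliberately omits the incentive-based program side (responding to operator dispatch signals), which in the paper is a second use of the same ESS; adding it would mean extending the reward with an incentive term when a dispatch event is active.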

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Jung, Jaesung (정재성)
Department of Electrical and Computer Engineering

File Download

  • There are no files associated with this item.