Ajou University repository

Multi-Agent Distributed DQN and Transfer Learning for Energy-Efficient Power Management in Solar Energy-Harvested Small-Cell Networks
  • Cho, Hyebin ;
  • Kim, Hyungsub ;
  • Na, Jee Hyeon ;
  • Lim, Seung Chan ;
  • Lee, Howon
Citations (Scopus): 2

DC Field: Value
dc.contributor.author: Cho, Hyebin
dc.contributor.author: Kim, Hyungsub
dc.contributor.author: Na, Jee Hyeon
dc.contributor.author: Lim, Seung Chan
dc.contributor.author: Lee, Howon
dc.date.issued: 2025-01-01
dc.identifier.issn: 2327-4662
dc.identifier.uri: https://aurora.ajou.ac.kr/handle/2018.oak/38520
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85218922380&origin=inward
dc.description.abstract: The integration of solar energy harvesting into small-cell networks is a promising solution for achieving energy-efficient and sustainable wireless communications. However, the inherent variability and intermittency of solar energy, coupled with the need for precise inter-cell interference management, significantly hinder efficient network operation. To resolve these challenges, we propose a multi-agent distributed deep Q-network (MA-DDQN) framework, in which distributed base stations learn optimal transmit power control policies. To further enhance adaptability under varying solar conditions, we present a daily model transfer with fine-tuning, enabling efficient deployment without extensive training overhead. Simulation results demonstrate that the proposed methods remarkably improve energy efficiency while maintaining robust adaptability under dynamic solar conditions, revealing their potential for sustainable small-cell network deployments.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers Inc.
dc.subject.mesh: Condition
dc.subject.mesh: Efficient power managements
dc.subject.mesh: Energy
dc.subject.mesh: Energy efficient
dc.subject.mesh: Multi agent
dc.subject.mesh: Multi-agent reinforcement learning
dc.subject.mesh: Small cell Networks
dc.subject.mesh: Solar energy harvesting
dc.subject.mesh: Transfer learning
dc.subject.mesh: Wireless communications
dc.title: Multi-Agent Distributed DQN and Transfer Learning for Energy-Efficient Power Management in Solar Energy-Harvested Small-Cell Networks
dc.type: Article
dc.citation.title: IEEE Internet of Things Journal
dc.identifier.bibliographicCitation: IEEE Internet of Things Journal
dc.identifier.doi: 10.1109/jiot.2025.3545027
dc.identifier.scopusid: 2-s2.0-85218922380
dc.identifier.url: http://ieeexplore.ieee.org/servlet/opac?punumber=6488907
dc.subject.keyword: Energy efficiency
dc.subject.keyword: multi-agent reinforcement learning
dc.subject.keyword: small-cell networks
dc.subject.keyword: solar energy harvesting
dc.subject.keyword: transfer learning
dc.type.other: Article
dc.identifier.pissn: 23274662
dc.description.isoa: false
dc.subject.subarea: Signal Processing
dc.subject.subarea: Information Systems
dc.subject.subarea: Hardware and Architecture
dc.subject.subarea: Computer Science Applications
dc.subject.subarea: Computer Networks and Communications
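
The abstract describes the approach only at a high level, and the record contains no code. As a rough illustration of the idea, the sketch below uses tabular Q-learning as a stand-in for the paper's deep Q-network: each base-station agent independently selects a discrete transmit-power level, the reward is a toy energy-efficiency proxy (rate divided by power, scaled by battery level), and day-two agents are initialized from day-one Q-values and fine-tuned with reduced exploration, mimicking the daily model transfer. All names, power levels, and dynamics here are hypothetical assumptions, not taken from the paper.

```python
import numpy as np

# Toy sketch: each small-cell base station (agent) independently learns
# a discrete transmit-power policy; a model trained on one "day" is
# transferred and fine-tuned for the next. Tabular Q-learning stands in
# for the paper's DQN; all constants below are hypothetical.

POWER_LEVELS = np.array([0.1, 0.5, 1.0, 2.0])  # transmit powers in W (assumed)
N_STATES = 4  # quantized battery level of the solar-charged cell (assumed)

def reward(state, action, interference):
    p = POWER_LEVELS[action]
    rate = np.log2(1.0 + p / (0.1 + interference))  # toy SINR-based rate
    battery = (state + 1) / N_STATES                # fuller battery -> cheaper power
    return battery * rate / p                       # crude energy-efficiency proxy

class Agent:
    def __init__(self, rng, q=None, eps=0.3):
        self.rng = rng
        self.q = np.zeros((N_STATES, len(POWER_LEVELS))) if q is None else q.copy()
        self.eps = eps  # exploration rate; lowered when fine-tuning a transferred model
    def act(self, s):
        if self.rng.random() < self.eps:
            return int(self.rng.integers(len(POWER_LEVELS)))
        return int(np.argmax(self.q[s]))
    def update(self, s, a, r, s2, alpha=0.1, gamma=0.9):
        self.q[s, a] += alpha * (r + gamma * self.q[s2].max() - self.q[s, a])

def train_day(agents, rng, steps=2000):
    states = rng.integers(N_STATES, size=len(agents))
    for _ in range(steps):
        actions = [ag.act(s) for ag, s in zip(agents, states)]
        total_p = sum(POWER_LEVELS[a] for a in actions)
        for i, (ag, s, a) in enumerate(zip(agents, states, actions)):
            interference = total_p - POWER_LEVELS[a]  # power of the other cells
            s2 = rng.integers(N_STATES)               # toy solar/battery dynamics
            ag.update(s, a, reward(s, a, interference), s2)
            states[i] = s2

rng = np.random.default_rng(0)
day1 = [Agent(rng) for _ in range(3)]   # three distributed base stations
train_day(day1, rng)
# Daily model transfer: reuse day-1 Q-values, fine-tune with less exploration.
day2 = [Agent(rng, q=ag.q, eps=0.05) for ag in day1]
train_day(day2, rng, steps=500)
```

The transfer step is simply weight reuse plus a shorter, low-exploration training pass, which is one plausible reading of "daily model transfer with fine-tuning"; the actual architecture and reward in the paper may differ.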

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Lee, Howon (이호원)
Department of Electrical and Computer Engineering

File Download

  • There are no files associated with this item.