Citation Export
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Cho, Hyebin | - |
| dc.contributor.author | Kim, Hyungsub | - |
| dc.contributor.author | Na, Jee Hyeon | - |
| dc.contributor.author | Lim, Seung Chan | - |
| dc.contributor.author | Lee, Howon | - |
| dc.date.issued | 2025-01-01 | - |
| dc.identifier.issn | 2327-4662 | - |
| dc.identifier.uri | https://aurora.ajou.ac.kr/handle/2018.oak/38520 | - |
| dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85218922380&origin=inward | - |
| dc.description.abstract | The integration of solar energy harvesting into small-cell networks is a promising solution for achieving energy-efficient and sustainable wireless communications. However, the inherent variability and intermittency of solar energy, coupled with the need for precise inter-cell interference management, significantly hinder efficient network operation. To address these challenges, we propose a multi-agent distributed deep Q-network (MA-DDQN) framework in which distributed base stations learn optimal transmit power control policies. To further enhance adaptability under varying solar conditions, we present a daily model-transfer and fine-tuning approach, enabling efficient deployment without extensive training overhead. Simulation results demonstrate that the proposed methods markedly improve energy efficiency while maintaining robust adaptability under dynamic solar conditions, revealing their potential for sustainable small-cell network deployments. (See the illustrative sketch following this record.) | - |
| dc.language.iso | eng | - |
| dc.publisher | Institute of Electrical and Electronics Engineers Inc. | - |
| dc.subject.mesh | Condition | - |
| dc.subject.mesh | Efficient power management | - |
| dc.subject.mesh | Energy | - |
| dc.subject.mesh | Energy efficient | - |
| dc.subject.mesh | Multi-agent | - |
| dc.subject.mesh | Multi-agent reinforcement learning | - |
| dc.subject.mesh | Small-cell networks | - |
| dc.subject.mesh | Solar energy harvesting | - |
| dc.subject.mesh | Transfer learning | - |
| dc.subject.mesh | Wireless communications | - |
| dc.title | Multi-Agent Distributed DQN and Transfer Learning for Energy-Efficient Power Management in Solar Energy-Harvested Small-Cell Networks | - |
| dc.type | Article | - |
| dc.citation.title | IEEE Internet of Things Journal | - |
| dc.identifier.bibliographicCitation | IEEE Internet of Things Journal | - |
| dc.identifier.doi | 10.1109/jiot.2025.3545027 | - |
| dc.identifier.scopusid | 2-s2.0-85218922380 | - |
| dc.identifier.url | http://ieeexplore.ieee.org/servlet/opac?punumber=6488907 | - |
| dc.subject.keyword | Energy efficiency | - |
| dc.subject.keyword | multi-agent reinforcement learning | - |
| dc.subject.keyword | small-cell networks | - |
| dc.subject.keyword | solar energy harvesting | - |
| dc.subject.keyword | transfer learning | - |
| dc.type.other | Article | - |
| dc.identifier.pissn | 2327-4662 | - |
| dc.description.isoa | false | - |
| dc.subject.subarea | Signal Processing | - |
| dc.subject.subarea | Information Systems | - |
| dc.subject.subarea | Hardware and Architecture | - |
| dc.subject.subarea | Computer Science Applications | - |
| dc.subject.subarea | Computer Networks and Communications | - |
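The abstract describes two techniques: distributed DQN agents, one per base station, selecting transmit power levels from local observations, and a daily model transfer with fine-tuning to track changing solar conditions. The sketch below is a minimal illustration of that structure, not the authors' implementation: the state features, power levels, hyperparameters, and all names (`QNetwork`, `Agent`, `transfer_and_fine_tune`) are assumptions, and a single-step TD update stands in for the paper's full DQN training loop.

```python
# Illustrative sketch only: every name, dimension, and hyperparameter here is
# an assumption for exposition, not the paper's actual implementation.
import copy
import random

import torch
import torch.nn as nn

POWER_LEVELS = [0.0, 0.1, 0.2, 0.4, 0.8]  # hypothetical discrete transmit powers (W)
STATE_DIM = 4  # e.g. battery level, harvest rate, measured SINR, traffic load


class QNetwork(nn.Module):
    """Small MLP mapping a local state vector to Q-values over power levels."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64),
            nn.ReLU(),
            nn.Linear(64, len(POWER_LEVELS)),
        )

    def forward(self, s):
        return self.net(s)


class Agent:
    """One small-cell base station; acts on its own observations only."""

    def __init__(self, eps=0.1, gamma=0.95, lr=1e-3):
        self.q = QNetwork()
        self.eps, self.gamma = eps, gamma
        self.opt = torch.optim.Adam(self.q.parameters(), lr=lr)

    def act(self, state):
        # Epsilon-greedy choice of a transmit power index.
        if random.random() < self.eps:
            return random.randrange(len(POWER_LEVELS))
        with torch.no_grad():
            return int(self.q(torch.as_tensor(state)).argmax())

    def update(self, s, a, r, s_next):
        """One TD(0) step toward r + gamma * max_a' Q(s', a')."""
        s = torch.as_tensor(s)
        s_next = torch.as_tensor(s_next)
        with torch.no_grad():
            target = r + self.gamma * self.q(s_next).max()
        loss = (self.q(s)[a] - target) ** 2
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()


def transfer_and_fine_tune(yesterday: Agent, eps=0.02) -> Agent:
    """Daily model transfer: reuse yesterday's weights and fine-tune with a
    reduced exploration rate instead of retraining from scratch."""
    today = copy.deepcopy(yesterday)
    today.eps = eps
    return today
```

Under these assumptions, each small cell would run its own `Agent` on local state, and `transfer_and_fine_tune` would seed each new day from the previous day's weights, which corresponds to the deployment-without-extensive-training-overhead claim in the abstract.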