A Study of Reinforcement Learning Framework for Energy Management in Smart Grids: Integrating Market Trading, Load Forecasting, and Vertical Agents 


Vol. 51,  No. 2, pp. 324-347, Feb.  2026
10.7840/kics.2026.51.2.324


  Abstract

This paper proposes an integrated reinforcement learning (RL) framework for optimizing energy management in smart grids and microgrids that addresses both real-time operations and day-ahead market trading. By designing reward functions that incorporate real-time market prices, grid demand, peak penalties, and forecasted load values, the framework directs optimal charging, discharging, or holding actions of a Battery Energy Storage System (BESS). A comprehensive battery model captures state-of-charge (SoC) dynamics with round-trip efficiency losses, cycle-based degradation estimated with the rainflow counting algorithm, and operational constraints including ramp-rate limits. This physics-based degradation modeling, which accounts for nonlinear depth-of-discharge effects and electrochemical aging mechanisms (SEI growth, lithium plating, electrode stress), enables the RL agent to balance immediate energy arbitrage profits against long-term asset preservation through optimized shallow-cycling strategies. The framework employs Proximal Policy Optimization (PPO) for stable multi-objective policy learning and integrates day-ahead load forecasting using Transformer models. A novel contribution is the application of vertical agents powered by small-scale large language models (sLLMs) to translate RL decisions into executable schedules through an intuitive human-machine interface, bridging the gap between optimal policies and practical implementation.
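The SoC dynamics with round-trip efficiency losses and the price/peak-penalty reward described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the capacity, efficiency split, penalty weight, and sign convention (positive power charges, negative discharges) are all assumptions chosen for the example.

```python
def soc_update(soc, power_kw, dt_h=1.0, cap_kwh=100.0,
               eta_ch=0.95, eta_dis=0.95):
    """One-step SoC update with a charge/discharge efficiency split.

    power_kw > 0 charges the battery, power_kw < 0 discharges it.
    Charging stores only eta_ch of the drawn energy; discharging
    draws 1/eta_dis of the delivered energy from the cells.
    """
    if power_kw >= 0:
        delta = power_kw * dt_h * eta_ch / cap_kwh
    else:
        delta = power_kw * dt_h / (eta_dis * cap_kwh)
    return min(1.0, max(0.0, soc + delta))


def step_reward(power_kw, price, net_load_kw, peak_limit_kw,
                peak_penalty=10.0, dt_h=1.0):
    """Illustrative one-step reward: arbitrage revenue minus a peak penalty.

    Discharging (power_kw < 0) sells energy at the current price;
    any demand above peak_limit_kw after the BESS action is penalized.
    """
    revenue = -power_kw * dt_h * price
    overshoot = max(0.0, net_load_kw + power_kw - peak_limit_kw)
    return revenue - peak_penalty * overshoot
```

In the full framework such terms would be combined with forecasted load and a degradation cost from rainflow cycle counting; the weights here are placeholders.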


  Cite this article

[IEEE Style]

K. Valiev, S. Ikromov, Y. Kim, "A Study of Reinforcement Learning Framework for Energy Management in Smart Grids: Integrating Market Trading, Load Forecasting, and Vertical Agents," The Journal of Korean Institute of Communications and Information Sciences, vol. 51, no. 2, pp. 324-347, 2026. DOI: 10.7840/kics.2026.51.2.324.

[ACM Style]

Koyiljon Valiev, Sukhrob Ikromov, and Kim Young-il. 2026. A Study of Reinforcement Learning Framework for Energy Management in Smart Grids: Integrating Market Trading, Load Forecasting, and Vertical Agents. The Journal of Korean Institute of Communications and Information Sciences, 51, 2, (2026), 324-347. DOI: 10.7840/kics.2026.51.2.324.

[KICS Style]

Koyiljon Valiev, Sukhrob Ikromov, Kim Young-il, "A Study of Reinforcement Learning Framework for Energy Management in Smart Grids: Integrating Market Trading, Load Forecasting, and Vertical Agents," The Journal of Korean Institute of Communications and Information Sciences, vol. 51, no. 2, pp. 324-347, 2. 2026. (https://doi.org/10.7840/kics.2026.51.2.324)