Degradation-Aware Energy Management in Residential Microgrids: A Reinforcement Learning Framework

Danial Zendehdel; Gianluca Ferro; Enrico De Santis; Antonello Rizzi
2026

Abstract

This paper presents a degradation-aware reinforcement learning (RL) framework for real-time energy management in residential microgrids, focusing on optimizing lithium-ion battery usage while balancing economic benefits and battery longevity. We employ the Soft Actor-Critic (SAC) algorithm, implemented via Stable Baselines3, to learn non-linear dispatch policies for a 5.2 kWh LiCoO2 battery pack, with degradation modeled using a simplified energy-throughput approach calibrated with NASA dataset measurements. The framework is tested across diverse household profiles over 1-year and 10-year simulations. Results show that RL-SAC outperforms a Model Predictive Control (MPC) baseline, extending battery life and reducing energy purchases in both simulations. These findings highlight RL-SAC's potential for practical deployment in microgrids, offering a scalable solution for sustainable energy management.
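The abstract pairs an off-the-shelf SAC agent (Stable Baselines3) with a simplified energy-throughput degradation model. Purely as an illustration of how such a pipeline can be wired together, the sketch below defines a toy dispatch environment and trains SAC on it. The BatteryDispatchEnv class, its synthetic load profile, the reward weights, and the linear fade coefficient are all assumptions for demonstration, not the authors' NASA-calibrated model or their MPC comparison setup.

# A minimal, hypothetical sketch (not the authors' implementation): a toy hourly
# battery-dispatch environment with an energy-throughput degradation term, trained
# with the SAC implementation from Stable Baselines3. Dynamics, reward weights,
# and the linear fade coefficient are illustrative assumptions.

import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import SAC


class BatteryDispatchEnv(gym.Env):
    """Toy hourly dispatch of a 5.2 kWh pack against a synthetic net-load profile."""

    def __init__(self, capacity_kwh=5.2, fade_per_kwh=1e-4,
                 price_eur_kwh=0.25, deg_cost_eur_kwh=0.03):
        super().__init__()
        self.capacity_kwh = capacity_kwh   # nominal pack capacity (kWh)
        self.fade_per_kwh = fade_per_kwh   # assumed capacity fade per kWh of throughput
        self.price = price_eur_kwh         # assumed flat import tariff
        self.deg_cost = deg_cost_eur_kwh   # assumed economic penalty per kWh cycled
        # observation: [state of charge, state of health, current net load in kW]
        self.observation_space = spaces.Box(low=-10.0, high=10.0, shape=(3,), dtype=np.float32)
        # action: battery power in kW (negative = charge, positive = discharge)
        self.action_space = spaces.Box(low=-2.0, high=2.0, shape=(1,), dtype=np.float32)

    def _net_load(self):
        # synthetic household net load (kW), purely for demonstration
        return 1.0 + np.sin(2.0 * np.pi * self.t / 24.0)

    def _obs(self):
        return np.array([self.soc, self.soh, self._net_load()], dtype=np.float32)

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.soc = 0.5   # state of charge (fraction of usable capacity)
        self.soh = 1.0   # state of health (fraction of nominal capacity)
        self.t = 0
        return self._obs(), {}

    def step(self, action):
        usable = self.capacity_kwh * self.soh
        # 1 h step, so kW and kWh coincide; clip to stored energy / remaining headroom
        p_batt = float(np.clip(action[0], -(1.0 - self.soc) * usable, self.soc * usable))
        self.soc -= p_batt / usable
        # simplified energy-throughput degradation: capacity fades with energy cycled
        self.soh = max(0.7, self.soh - self.fade_per_kwh * abs(p_batt))
        grid_import = max(0.0, self._net_load() - p_batt)   # battery offsets the load
        # reward trades purchase cost against a throughput (degradation) penalty
        reward = -self.price * grid_import - self.deg_cost * abs(p_batt)
        self.t += 1
        truncated = self.t >= 24 * 7   # one synthetic week per episode
        return self._obs(), reward, False, truncated, {}


if __name__ == "__main__":
    env = BatteryDispatchEnv()
    model = SAC("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=5_000)   # short run, just to exercise the pipeline

In a degradation-aware setup of this kind, the deg_cost term is what steers the policy away from aggressive cycling; in the paper this trade-off is learned against the NASA-calibrated throughput model rather than the fixed coefficient assumed here.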
Year: 2026
Conference: 17th International Joint Conference on Computational Intelligence, IJCCI 2025
Keywords: reinforcement learning; battery management system; energy management; lithium-ion batteries; degradation modeling; microgrid
Publication type: 04 Conference proceedings::04b Conference paper in volume
Citation: Degradation-Aware Energy Management in Residential Microgrids: A Reinforcement Learning Framework / Zendehdel, Danial; Ferro, Gianluca; De Santis, Enrico; Rizzi, Antonello. - (2026). (Paper presented at the 17th International Joint Conference on Computational Intelligence, IJCCI 2025, held in Marbella, Spain).
Files attached to this product
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11573/1749871