DETSOC Transactions on Industry Applications


Formulation and Simulation of Optimal Energy Mix Model Using Reinforcement Learning Methods With a Test Case on Philippine Monthly Energy Utilization Data for 2020-2022

Reynaldo Ted L Peñas II (University of the Philippines & DOST-ASTI, Philippines)
Prospero C. Naval, Jr. (University of the Philippines, Philippines)
Publisher: Dubai Electro Technical Society

Meet the Editor

Editor-in-Chief

Dr. Mohsen Imanieh 
editor-in-chief@detsoc.org


Abstract:

Traditional methods for generating an optimal energy mix in the Philippines involve selecting from various power producers, each with a different cost per kilowatt-hour (kWh); the energy is then transmitted and distributed among the end-user sectors. This study explores a novel approach that leverages reinforcement learning (RL) techniques to optimize the energy mix. The main objective is to establish an energy mix model that can be optimized when appropriate RL methods are employed. The specific objective is to determine an optimal energy mix over a 36-month period (2020-2022) in the Philippines using the established model. The RL techniques evaluated are Q-learning, Deep Q-Network (DQN), Double Q-learning, and Actor-Critic, each tested with different learning rates (alpha) and discount factors (gamma). The data comprise monthly energy generated (in GWh) from various sources (coal, oil-based, natural gas, and renewables), monthly energy consumption (in GWh) by sector (residential, commercial, and industrial), and the average monthly price (in Philippine Peso, PhP) across the country. The Actor-Critic algorithm with alpha = 0.10 and gamma = 0.90 emerged as the most effective in optimizing the energy mix. The simulations showed that all RL algorithms met the energy demand without shortages, demonstrating their potential to enhance energy management by continuously adapting to changes in energy production and consumption patterns. This approach not only automates decision-making but also improves efficiency by utilizing historical data and addressing the fluctuating nature of energy markets.
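To make the setup concrete, the tabular Q-learning variant mentioned in the abstract can be sketched as follows. This is an illustrative toy, not the paper's implementation: the state/action encoding, the per-source costs (COSTS), the candidate mixes (ACTIONS), and the demand figures (DEMAND) are invented stand-ins, and only the learning rate alpha = 0.10 and discount factor gamma = 0.90 are taken from the abstract (which reports Actor-Critic, not Q-learning, as the best performer).

```python
import random

# Illustrative sketch only: the abstract does not give the exact state/action
# encoding or reward. Here the state is the month index, each action is a
# discrete generation mix over [coal, oil, natural gas, renewables], and the
# reward is the negative cost of serving that month's demand.
random.seed(0)

MONTHS = 12
COSTS = [4.0, 7.5, 5.0, 6.0]   # hypothetical PhP per kWh per source
ACTIONS = [                     # fraction of demand met by each source (sums to 1)
    (0.6, 0.1, 0.2, 0.1),
    (0.4, 0.1, 0.3, 0.2),
    (0.3, 0.0, 0.3, 0.4),
]
DEMAND = [8000 + 200 * m for m in range(MONTHS)]  # toy monthly demand, GWh

ALPHA, GAMMA, EPSILON = 0.10, 0.90, 0.1  # alpha and gamma as in the abstract

def reward(month, mix):
    # Demand is always met (mix fractions sum to 1), so the reward is simply
    # the negative total generation cost for the month.
    return -DEMAND[month] * sum(f * c for f, c in zip(mix, COSTS))

Q = [[0.0] * len(ACTIONS) for _ in range(MONTHS)]

for _ in range(2000):                    # episodes over the 12-month horizon
    for m in range(MONTHS):
        if random.random() < EPSILON:    # epsilon-greedy exploration
            a = random.randrange(len(ACTIONS))
        else:
            a = max(range(len(ACTIONS)), key=lambda i: Q[m][i])
        target = reward(m, ACTIONS[a])
        if m + 1 < MONTHS:               # bootstrap from the next month's value
            target += GAMMA * max(Q[m + 1])
        Q[m][a] += ALPHA * (target - Q[m][a])

# Greedy policy after training: index of the chosen mix for each month.
policy = [max(range(len(ACTIONS)), key=lambda i: Q[m][i]) for m in range(MONTHS)]
```

With these invented costs the coal-heavy first mix is cheapest every month, so the learned greedy policy selects it throughout; in the paper's setting the learned policy would instead depend on the actual monthly generation, consumption, and price data.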
Published in: Dubai Electro Technical Society Transactions on Industry Applications (Volume: 01, Issue: 1, Sept. 2024)
Authors:
1. Reynaldo Ted L Peñas II (University of the Philippines & DOST-ASTI, Philippines)
2. Prospero C. Naval, Jr. (University of the Philippines, Philippines)
References:
1. Peñas, R.T.L. II and Naval, P.C. Jr., "Optimal Energy Mix Modeling Simulation Using Reinforcement Learning on Philippine Monthly Energy Utilization for 2020-2022," 2024 6th International Conference on Electrical, Control and Instrumentation Engineering (ICECIE), Pattaya, Thailand, 2024, doi: 10.1109/ICECIE63774.2024.10815656.
2. Yap, J.T., Gabriola, A.J.P., and Herrera, C.F., "Managing the energy trilemma in the Philippines," Energy, Sustainability and Society, vol. 11, no. 34, 2021, doi: 10.1186/s13705-021-00309-1.
3. Marzbani, F. and Abdelfatah, A., "Economic Dispatch Optimization Strategies and Problem Formulation: A Comprehensive Review," Energies, vol. 17, no. 3, p. 550, 2024, doi: 10.3390/en17030550.
4. May, G. et al., "Energy management in manufacturing: From literature review to a conceptual framework," Journal of Cleaner Production, vol. 167, pp. 1464-1489, 2017.
5. Zhou, X. et al., "Optimization of building demand flexibility using reinforcement learning and rule-based expert systems," Applied Energy, vol. 350, p. 121792, 2023.
6. Husin, H. and Zaki, M., "A critical review of the integration of renewable energy sources with various technologies," Protection and Control of Modern Power Systems, vol. 6, no. 1, pp. 1-18, 2021.
7. Khan, U.N., "Short-term load forecasting by using artificial neural networks," Thesis, Bahçeşehir Üniversitesi Fen Bilimleri Enstitüsü, 2018.
8. Weron, R., "Electricity price forecasting: A review of the state-of-the-art with a look into the future," International Journal of Forecasting, vol. 30, no. 4, pp. 1030-1081, 2014.
9. Kramer, T.A., Staubitz, J.M., and Lehnhoff, S., "Learning optimal energy management strategies for a virtual power plant using reinforcement learning," IEEE Transactions on Smart Grid, vol. 8, no. 6, pp. 2841-2851, Nov. 2017.
10. Vieira, L.C., Longo, M., and Mura, M., "Are the European manufacturing and energy sectors on track for achieving net-zero emissions in 2050? An empirical analysis," Energy Policy, vol. 156, 2021, ISSN 0301-4215, doi: 10.1016/j.enpol.2021.112464.
11. Chertkov, M. et al., "Reinforcement learning for dynamic pricing in smart grid," IEEE Transactions on Smart Grid, vol. 6, no. 6, pp. 2976-2984, Nov. 2015.
12. Department of Energy, Philippines, https://www.doe.gov.ph, accessed 2023.
13. Meralco Online, www.company.meralco.com.ph, accessed 2023.
Page(s): 51-56
Date of Publication: 19 September 2024
Publisher: DETSOC
ISSN: 3079-3025
