Accelerating Learning-based Load-Shedding Scheme with Fast Dynamic Simulation in Python
| Published in | IEEE Power & Energy Society General Meeting, pp. 1 - 5 |
|---|---|
| Main Authors | |
| Format | Conference Proceeding |
| Language | English |
| Published | IEEE, 21.07.2024 |
| ISSN | 1944-9933 |
| DOI | 10.1109/PESGM51994.2024.10689215 |
| Summary: | Deep reinforcement learning (DRL), a branch of artificial intelligence (AI), has been introduced to predict and control the behavior of complex power systems. To explore the parameter space of the control policy effectively and adapt to multiple fault scenarios, these algorithms must run many power system dynamic simulations (DS), evaluating a sufficient number of perturbed policies at each training iteration. DS is critical for identifying physical stability constraints under inevitable system component failures, so accelerating the individual DS tasks inside DRL training is key to boosting training efficiency, and a fast simulator that meets these learning objectives is necessary. This paper discusses an accelerated DRL algorithm powered by a robust simulation model that implements the reduced Y method. To eliminate data transfer between the simulation and learning processes written in different computing languages and to keep the computing environment consistent, the application is developed purely in Python. With advanced mathematical optimizations and implementation techniques, training with nine scenarios and 300 iterations achieves a 1.46x speedup compared to the original approach. |
|---|---|
| ISSN: | 1944-9933 | 
| DOI: | 10.1109/PESGM51994.2024.10689215 |
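
The summary above describes two ingredients: a classical-model dynamic simulation built on the reduced Y (Kron-reduced admittance) matrix, and a training loop that evaluates many perturbed load-shedding policies per iteration, each evaluation requiring one simulation per fault scenario. The minimal pure-Python/NumPy sketch below illustrates that general workflow only; the network data, linear policy, reward, forward-Euler integrator, and ES-style update are illustrative assumptions, not the paper's actual model, algorithm, or hyperparameters.

```python
import numpy as np

def kron_reduce(Y, keep):
    """Eliminate all buses except `keep` (generator internal nodes):
    Y_red = Y_kk - Y_kd @ inv(Y_dd) @ Y_dk (Kron reduction)."""
    keep = np.asarray(keep)
    drop = np.setdiff1d(np.arange(Y.shape[0]), keep)
    return (Y[np.ix_(keep, keep)]
            - Y[np.ix_(keep, drop)]
            @ np.linalg.solve(Y[np.ix_(drop, drop)], Y[np.ix_(drop, keep)]))

def simulate(theta, Y_net, y_load, gen_nodes, load_buses, E, Pm, M, D,
             dt=0.01, steps=300):
    """Classical-model transient simulation on the Kron-reduced network.
    `theta` is an assumed linear load-shedding policy acting on machine
    states; returns a scalar reward (angle spread plus shedding cost)."""
    n = len(gen_nodes)
    delta = np.zeros(n)                 # rotor angles (rad)
    omega = np.zeros(n)                 # speed deviations (pu)
    shed = np.zeros(len(load_buses))    # cumulative shed fraction per load bus
    reward = 0.0
    for _ in range(steps):
        # Constant-admittance loads, scaled down by the shed fraction.
        Y = Y_net.copy()
        Y[load_buses, load_buses] += (1.0 - shed) * y_load
        Yred = kron_reduce(Y, gen_nodes)            # the "reduced Y" matrix
        V = E * np.exp(1j * delta)                  # internal EMFs
        Pe = np.real(V * np.conj(Yred @ V))         # electrical power output
        # Forward-Euler step of the swing equations (a real simulator would
        # use a higher-order or implicit integrator).
        delta = delta + dt * omega
        omega = omega + dt * (Pm - Pe - D * omega) / M
        # Policy output: incremental shedding per load bus, clipped per step.
        action = np.clip(theta @ np.concatenate([delta, omega]), 0.0, 0.1)
        shed = np.clip(shed + action, 0.0, 1.0)
        reward -= (np.ptp(delta) + 10.0 * shed.sum()) * dt
    return reward

def train(theta0, scenarios, sigma=0.02, lr=0.01, n_perturb=8, iters=300):
    """Perturbation-based policy search: every iteration evaluates pairs of
    perturbed policies, and every evaluation runs one dynamic simulation per
    fault scenario -- the workload a faster simulator directly shrinks."""
    theta = theta0.copy()
    for _ in range(iters):
        grad = np.zeros_like(theta)
        for _ in range(n_perturb):
            eps = np.random.randn(*theta.shape)
            r_plus = np.mean([simulate(theta + sigma * eps, **sc)
                              for sc in scenarios])
            r_minus = np.mean([simulate(theta - sigma * eps, **sc)
                               for sc in scenarios])
            grad += (r_plus - r_minus) * eps
        theta = theta + (lr / (n_perturb * sigma)) * grad
    return theta
```

Under these assumptions each training iteration runs 2 x n_perturb x len(scenarios) dynamic simulations (with nine scenarios and 300 iterations, tens of thousands of rollouts in total), which is why per-simulation speed, and avoiding cross-language data transfer by staying entirely in Python, dominates overall training time.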