Cyber Attack Detection and Mitigation Using Machine Learning in Smart Grid Systems
DOI: https://doi.org/10.70008/jeser.v1i01.43

Keywords: Cybersecurity, Smart Grid, Machine Learning, Intrusion Detection

Abstract
The increasing complexity and interconnectedness of smart grid systems have heightened their vulnerability to cyber threats, necessitating advanced solutions for securing these critical infrastructures. This study explores the potential of AI-driven predictive analytics to enhance the cybersecurity and operational efficiency of smart grids. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, a comprehensive review of 1,526 initially identified articles was conducted and narrowed, through rigorous screening and evaluation, to a final set of 127 high-quality studies. The review reveals that machine learning models, such as neural networks and time series forecasting, significantly improve the early detection and mitigation of cyber threats, enabling grid operators to take proactive measures against disruptions. Additionally, the integration of AI with real-time data analytics was found to optimize load forecasting, predict equipment failures, and improve overall grid resilience, reducing downtime and operational costs. However, significant challenges related to data privacy, scalability, and integration with legacy systems persist. The findings suggest that techniques such as federated learning and blockchain integration could address these challenges, though further research is needed to enhance model robustness and efficiency. This review provides critical insights into the practical applications of AI in smart grid cybersecurity, highlighting both the benefits and the barriers that must be overcome to achieve widespread adoption.
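To make the idea of time-series-based early detection concrete, the following is a minimal illustrative sketch, not taken from any of the reviewed studies: it flags anomalous smart-meter readings with a rolling z-score, a deliberately simple stand-in for the neural-network and forecasting models the review discusses. All names (`detect_anomalies`, `load`) and the threshold values are hypothetical.

```python
# Illustrative anomaly detection sketch (hypothetical, stdlib only):
# flag a reading as suspicious when it deviates more than `threshold`
# standard deviations from the rolling mean of the preceding window.
from statistics import mean, stdev

def detect_anomalies(readings, window=5, threshold=3.0):
    """Return indices of readings that deviate > threshold sigmas
    from the rolling statistics of the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(readings)):
        history = readings[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(readings[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A stable load profile (in MW) with one injected spike, e.g. a spoofed
# reading from a false data injection attack.
load = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2, 120.0, 50.1, 49.7]
print(detect_anomalies(load))  # → [6], the injected spike
```

In practice the reviewed models replace the rolling mean with a learned forecast, but the operational pattern is the same: compare observed grid telemetry against an expected value and alert on large deviations.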