Abstract
Vehicular Ad Hoc Networks (VANETs) play a crucial role in road safety and the deployment of Intelligent Transportation Systems (ITS). However, their highly dynamic topology and the continuous growth of data exchanges make them particularly vulnerable to intrusions and abnormal behaviors, which can compromise the reliability of communications. Traditional anomaly detection approaches, often relying on classical machine learning or neural network classifiers, show limitations when faced with the diversity and rapid evolution of attacks. In this context, generative artificial intelligence offers a new perspective. By learning the distribution of normal network behaviors and generating realistic representations of traffic flows or patterns, generative models such as Variational Autoencoders (VAEs) and Generative Adversarial Networks (GANs) can identify subtle deviations that may indicate potential attacks or anomalies. To enhance the trustworthiness of these systems, the integration of explainable AI (XAI) mechanisms enables the interpretation of model decisions and provides insights into why certain behaviors are classified as anomalous, thereby facilitating validation by cybersecurity experts and supporting real-time decision-making. The objective of this research is to explore the integration of generative AI into proactive anomaly detection mechanisms within VANETs, with the aim of improving the resilience, reliability, and security of vehicular communications.
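The core idea above — learn the distribution of normal traffic, then score new observations by how poorly the learned model reproduces them — can be sketched compactly. The snippet below is an illustrative toy, not the talk's method: it substitutes a linear PCA projection for a trained VAE's encoder/decoder, and the two feature names and all data are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "normal" VANET traffic features (e.g. message rate, speed delta).
# Feature choice and values are illustrative only.
normal = rng.normal(loc=[10.0, 0.5], scale=[1.0, 0.1], size=(500, 2))

# Learn the normal behavior: fit a low-rank linear model (PCA) as a
# lightweight stand-in for a VAE's learned latent representation.
mean = normal.mean(axis=0)
centered = normal - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
components = vt[:1]  # keep the top principal direction as the "latent space"

def anomaly_score(x):
    """Reconstruction error: distance between x and its projection
    onto the subspace learned from normal traffic."""
    z = (x - mean) @ components.T   # "encode" into the latent space
    recon = z @ components + mean   # "decode" back to feature space
    return float(np.linalg.norm(x - recon))

# Behavior close to the training distribution reconstructs well and scores low;
# a deviating pattern (e.g. a flooding attack) reconstructs poorly and scores high.
typical = np.array([10.1, 0.52])
attack = np.array([40.0, 3.0])
assert anomaly_score(attack) > anomaly_score(typical)
```

In a real VAE-based detector, the linear projection is replaced by a learned nonlinear encoder/decoder, and the score typically combines reconstruction error with the likelihood assigned by the latent prior; the flag-by-deviation principle is the same.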
Bio
Hela Marouane is currently an Associate Professor at the ESME-Sudria engineering school. She obtained a Master's degree in Informatics and Multimedia from the University of Sfax in 2010, and a PhD in Information Technology jointly from the University of Le Havre and the University of Sfax in 2015. Before joining ESME, she worked as an Associate Professor at the Institute of Informatics and Multimedia.
You can participate either online or in person at the LRE Open Space. For organisational reasons, in-person participants are asked to register via https://forms.office.com/e/5bLjTkQtyf before 2025-09-14.