Given the rapid growth of electric vehicles (EVs) and their finite battery capacity, improving on-road charging strategies presents significant challenges: long waiting times cause customer dissatisfaction, and the dynamic nature of traffic data can lead drivers to choose suboptimal charging stations. Whereas traditional approaches depend primarily on static road data, our model utilizes real-time traffic data from Google Maps to capture live congestion patterns and dynamically optimize charging assignments, minimizing costs and improving overall driver satisfaction. This study focuses on three fundamental questions for the optimal on-road charging of a fleet of EVs: (i) when is the best time to charge each vehicle; (ii) where is the optimal charging location for each EV; and (iii) how should charging be planned given road conditions and battery level. Our objective is to determine the optimal charging time and location that minimize the weighted sum of travel and charging time, charging cost, and penalties for late charging and station overutilization, while accounting for key factors such as real-time traffic conditions, the spatial distribution of charging stations, and EV-specific attributes such as state of charge (SOC), driving range, and travel efficiency. To develop a robust and adaptive EV charging recommendation system, we employ Multi-Agent Reinforcement Learning (MARL) to derive an intelligent, self-improving charging strategy that adapts to changing conditions. Numerical simulations demonstrate that our model provides an applicable and scalable solution for EV users and urban planners, contributing to more efficient and intelligent EV charging infrastructure. We also compare the results of our MARL model against an exact approach on small-scale instances solved with the Gurobi package in Python.
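The weighted-sum objective described above can be sketched in a few lines of Python. All weights, field names, and penalty definitions below are illustrative assumptions for a single EV choosing among candidate stations; they are not the paper's calibrated formulation.

```python
# Minimal sketch of a weighted charging objective for one EV, assuming
# hypothetical weights and penalty terms (not the paper's actual values).
from dataclasses import dataclass


@dataclass
class ChargingOption:
    travel_time: float    # minutes to reach the station (from live traffic)
    charging_time: float  # minutes to charge to the target SOC
    charging_cost: float  # monetary cost of the charging session
    lateness: float       # minutes past the SOC deadline (late-charging penalty)
    station_load: float   # current utilization ratio of the station (0..1+)


def weighted_cost(opt, w_time=1.0, w_cost=0.5, w_late=2.0, w_over=3.0,
                  capacity_threshold=0.8):
    """Weighted sum of time, cost, and penalty terms (illustrative weights)."""
    # Overutilization is penalized only above a load threshold.
    overuse = max(0.0, opt.station_load - capacity_threshold)
    return (w_time * (opt.travel_time + opt.charging_time)
            + w_cost * opt.charging_cost
            + w_late * opt.lateness
            + w_over * overuse)


# Choose the station that minimizes the weighted cost for this EV.
options = [
    ChargingOption(12.0, 25.0, 8.0, 0.0, 0.9),   # close but busy station
    ChargingOption(20.0, 25.0, 6.5, 0.0, 0.4),   # farther, lightly loaded
]
best = min(options, key=weighted_cost)
```

In the full model, an analogous cost shapes each agent's reward in the MARL setting, so that fleet-level decisions trade off travel time, price, and station congestion rather than greedily picking the nearest charger.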



