
Energy management strategy for electric vehicles based on deep Q-learning using Bayesian optimization

Kong, H., Yang, J., Wang, H. (ORCID: 0000-0003-2789-9530) and Fan, L. (2019) Energy management strategy for electric vehicles based on deep Q-learning using Bayesian optimization. Neural Computing and Applications. In Press.

Link to Published Version: https://doi.org/10.1007/s00521-019-04556-4
*Subscription may be required

Abstract

In this paper, a deep Q-learning (DQL)-based energy management strategy (EMS) is designed for an electric vehicle. First, the energy management problem is reformulated to satisfy the conditions for employing DQL by considering the dynamics of the system. Then, to minimize electricity consumption and maximize battery lifetime, the DQL-based EMS is designed to properly split the power demand into two parts: one supplied by the battery and the other by the supercapacitor. In addition, a hyperparameter tuning method, Bayesian optimization (BO), is introduced to optimize the hyperparameter configuration of the DQL-based EMS. Simulations are conducted to validate the improvements brought by BO and the convergence of the DQL algorithm equipped with the tuned hyperparameters. Simulations on both the training and testing datasets further validate the optimality and adaptability of the DQL-based EMS, which outperforms a previously published rule-based EMS in almost all cases.
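For readers wanting a concrete picture of the approach the abstract describes, the following is a minimal, self-contained sketch, not the paper's implementation: a tabular Q-learning agent (standing in for the paper's deep Q-network) learns a battery/supercapacitor power split on a toy drive cycle, and scikit-optimize's gp_minimize plays the role of the BO hyperparameter tuner. The plant model, reward weights, state discretization, and hyperparameter ranges below are all hypothetical.

import numpy as np
from skopt import gp_minimize
from skopt.space import Real

rng = np.random.default_rng(0)

N_DEMAND, N_SOC, N_ACTIONS = 10, 10, 5        # discretized state/action grids (hypothetical)
SPLIT = np.linspace(0.0, 1.0, N_ACTIONS)      # fraction of demand sent to the battery
W_BATT = 0.5                                  # hypothetical battery-wear weight

def env_step(demand_kw, soc, action):
    """Toy plant model: split the demand, update battery state of charge,
    and return a cost combining electricity use with a battery-stress
    penalty (a crude proxy for lifetime degradation)."""
    p_batt = SPLIT[action] * demand_kw
    soc_next = float(np.clip(soc - 0.002 * p_batt, 0.0, 1.0))
    cost = 0.01 * demand_kw + W_BATT * 1e-3 * p_batt ** 2
    return soc_next, cost

def state_index(demand_kw, soc):
    """Map continuous (demand, SoC) onto the discrete state grid."""
    return (min(int(demand_kw / 50.0 * N_DEMAND), N_DEMAND - 1),
            min(int(soc * N_SOC), N_SOC - 1))

def train_and_evaluate(alpha, gamma, eps, episodes=200, horizon=100):
    """Tabular Q-learning on a random toy drive cycle; returns the mean
    per-step cost, used below as the black-box objective for BO."""
    Q = np.zeros((N_DEMAND, N_SOC, N_ACTIONS))
    total = 0.0
    for _ in range(episodes):
        soc, demand = 0.9, rng.uniform(0.0, 50.0)
        for _ in range(horizon):
            s = state_index(demand, soc)
            # epsilon-greedy action selection over the power-split fractions
            a = rng.integers(N_ACTIONS) if rng.random() < eps else int(np.argmax(Q[s]))
            soc, cost = env_step(demand, soc, a)
            demand = rng.uniform(0.0, 50.0)   # next power demand (toy cycle)
            s2 = state_index(demand, soc)
            # Q-learning update with reward = negative cost
            Q[s + (a,)] += alpha * (-cost + gamma * Q[s2].max() - Q[s + (a,)])
            total += cost
    return total / (episodes * horizon)

# Bayesian optimization over (alpha, gamma, epsilon); ranges are hypothetical.
result = gp_minimize(
    lambda p: train_and_evaluate(*p),
    [Real(0.01, 0.5),      # learning rate alpha
     Real(0.80, 0.999),    # discount factor gamma
     Real(0.01, 0.30)],    # exploration rate epsilon
    n_calls=25, random_state=0)
print("best hyperparameters:", result.x, "mean cost:", result.fun)

The BO step treats the trained policy's mean cost as a black-box objective over the hyperparameters, mirroring the abstract's use of BO to select the hyperparameter configuration for the DQL-based EMS.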

Item Type: Journal Article
Murdoch Affiliation: College of Science, Health, Engineering and Education
Publisher: Springer London
Copyright: © 2019 Springer-Verlag London Ltd., part of Springer Nature
URI: http://researchrepository.murdoch.edu.au/id/eprint/52934