Murdoch University Research Repository


Deep additive least squares support vector machines for classification with model transfer

Wang, G. (ORCID: 0000-0002-5258-0532), Zhang, G., Choi, K.-S. and Lu, J. (2019) Deep additive least squares support vector machines for classification with model transfer. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 49 (7). pp. 1527-1540.



The additive kernel least squares support vector machine (AK-LS-SVM) has been widely used in classification tasks due to its inherent advantages. For example, additive kernels work extremely well for certain tasks, such as computer vision classification, medical research, and other specialized scenarios. Moreover, the analytical solution of AK-LS-SVM allows leave-one-out cross-validation error estimates to be formulated in closed form for parameter tuning, which drastically reduces the computational cost and guarantees generalization performance, especially on small and medium datasets. However, AK-LS-SVM still faces two main challenges: 1) improving its classification performance and 2) saving time when performing a grid search for model selection. Inspired by the stacked generalization principle and the transfer learning mechanism, a layer-by-layer combination of AK-LS-SVM classifiers embedded with transfer learning is proposed in this paper. This new classifier, called the deep transfer additive kernel least squares support vector machine (DTA-LS-SVM), overcomes these two challenges. Also, considering that imbalanced datasets arise in many real-world scenarios, especially in medical data analysis, the deep-transfer element is extended to compensate for class imbalance, leading to the development of another new classifier, iDTA-LS-SVM. In the hierarchical structure of both DTA-LS-SVM and iDTA-LS-SVM, each layer contains an AK-LS-SVM, and the predictions from the previous layer act as an additional input feature for the current layer. Importantly, transfer learning is also embedded to guarantee generalization consistency between adjacent layers. Moreover, both DTA-LS-SVM and iDTA-LS-SVM ensure a minimal leave-one-out error by applying the proposed fast leave-one-out cross-validation strategy to the training set in each layer.
We compared the proposed classifiers DTA-LS-SVM and iDTA-LS-SVM with the traditional LS-SVM and SVM using additive ...
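The two building blocks described in the abstract — the closed-form LS-SVM solution and the layer-by-layer stacking in which each layer's predictions are appended as an extra input feature — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the histogram-intersection kernel stands in as one example of an additive kernel (it assumes non-negative features), the transfer-learning and imbalance-compensation terms are omitted, and all function names are illustrative.

```python
import numpy as np

def additive_kernel(X, Z):
    # Histogram-intersection kernel, a common additive kernel:
    # k(x, z) = sum_d min(x_d, z_d); assumes non-negative features.
    return np.minimum(X[:, None, :], Z[None, :, :]).sum(axis=2)

def ls_svm_fit(X, y, gamma=1.0):
    # Closed-form LS-SVM training: solve the linear system
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y].
    n = len(y)
    K = additive_kernel(X, X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def ls_svm_predict(X_train, b, alpha, X_test):
    # Decision values f(x) = sum_i alpha_i k(x, x_i) + b.
    return additive_kernel(X_test, X_train) @ alpha + b

def deep_stack_fit_predict(X_train, y, X_test, n_layers=3, gamma=1.0):
    # Layer-by-layer stacking: each layer trains an LS-SVM and appends
    # its predictions as one extra input feature for the next layer
    # (the cross-layer transfer regularization is not modeled here).
    Xtr, Xte = X_train, X_test
    for _ in range(n_layers):
        b, alpha = ls_svm_fit(Xtr, y, gamma)
        p_tr = ls_svm_predict(Xtr, b, alpha, Xtr)
        p_te = ls_svm_predict(Xtr, b, alpha, Xte)
        Xtr = np.hstack([Xtr, p_tr[:, None]])
        Xte = np.hstack([Xte, p_te[:, None]])
    return np.sign(p_te)
```

Because the LS-SVM system is a single linear solve, each layer trains in closed form; the paper additionally exploits this structure for fast leave-one-out cross-validation when tuning `gamma` per layer.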

Item Type: Journal Article
Publisher: IEEE
Copyright: © 2017 IEEE