WSEAS Transactions on Circuits and Systems
Print ISSN: 1109-2734, E-ISSN: 2224-266X
Volume 17, 2018
Neural Network and SVM classification via Decision Trees, Vector Quantization and Simulated Annealing
Abstract: This work presents a method for classification with a Support Vector Machine (SVM) based on a Decision Tree (DT) algorithm and on Vector Quantization. A probabilistic Decision Tree algorithm focusing on large-frequency classes (DTPL) is developed, together with a method for SVM classification via DTs using Tabu Search (TS), denoted DT_SVM. To reduce the training complexity of the SVM, the DTPL produces partitions that can be treated as clusters, while the TS algorithm approximates the decision boundary of the SVM. Building on the DTs, an SVM algorithm is developed that improves SVM training time by considering only a subset of each cluster's instances. To reduce the size of the SVM training set, a vector quantization algorithm (the LBG) is used; the LBG classifier is based on Euclidean distance. Finally, an optimization method, Simulated Annealing (SA), is applied over the quantization level to find a minimization criterion combining low error and low complexity in support of the SVM operation. The resulting V_S_SVM achieves lower error at reasonable computational complexity. A Neural Network (NN) is composed of many neurons linked together according to a specific network topology. The main characteristics of SVMs and NNs are presented, and a comparison between NNs and SVMs with two types of kernels shows the superiority of the SVM. The V_S_SVM with an RBF kernel is comparable to DT_SVM and provides useful results. Simulation results for all the algorithms on data sets of different complexity are provided.
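The abstract's training-set-reduction stage can be illustrated with a minimal sketch of the LBG (Linde-Buzo-Gray) vector quantizer: the codebook is grown by splitting each codeword into a perturbed pair and then refined with Lloyd iterations under Euclidean distance, and the resulting codewords can stand in for the full training set when fitting the SVM. This is a generic illustration of LBG, not the paper's exact implementation; the function name `lbg_codebook`, the splitting perturbation `eps`, and the iteration count are assumptions.

```python
import numpy as np

def lbg_codebook(X, levels, iters=20, eps=1e-3):
    """Sketch of the LBG algorithm: grow a codebook by codeword
    splitting, then refine it with Lloyd (nearest-centroid) passes.
    `levels` is the target codebook size, assumed a power of 2."""
    # start from the global centroid of the data
    codebook = X.mean(axis=0, keepdims=True)
    while len(codebook) < levels:
        # split every codeword into a slightly perturbed pair
        codebook = np.vstack([codebook * (1 + eps), codebook * (1 - eps)])
        for _ in range(iters):
            # assign each vector to its nearest codeword (Euclidean distance)
            d = np.linalg.norm(X[:, None, :] - codebook[None, :, :], axis=2)
            assign = d.argmin(axis=1)
            # move each codeword to the centroid of its assigned cell
            for k in range(len(codebook)):
                members = X[assign == k]
                if len(members):
                    codebook[k] = members.mean(axis=0)
    return codebook

# usage: quantize 100 training vectors down to a 4-word codebook,
# which could then be labeled (e.g., by majority vote of each cell)
# and passed to the SVM in place of the full training set
X = np.vstack([np.zeros((50, 2)), 10.0 * np.ones((50, 2))])
cb = lbg_codebook(X, levels=4)
```

The quantization level (`levels`) is precisely the knob that, in the paper, Simulated Annealing tunes against a cost combining classification error and complexity.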
Search Articles
Pages: 19-25, Art. #3