Mathematics and Computer Science
Volume 1, Issue 1, May 2016, Pages: 1-4

An Improved PSO-SVM Algorithm for License Plate Recognition

Weichao Jiao, Junfei Dong

College of Mathematics and Information, China West Normal University, Nanchong Sichuan, China

Email address:

(Weichao Jiao)

To cite this article:

Weichao Jiao, Junfei Dong. An Improved PSO-SVM Algorithm for License Plate Recognition. Mathematics and Computer Science. Vol. 1, No. 1, 2016, pp. 1-4. doi: 10.11648/j.mcs.20160101.11

Received: March 7, 2016; Accepted: March 16, 2016; Published: April 10, 2016


Abstract: The support vector machine (SVM) is a machine learning algorithm with good performance. Its parameters have an important influence on classification accuracy, and parameter selection has become one of the main research areas of machine learning. This paper adopts a support vector machine to recognize license plate characters. To obtain good SVM parameters, we propose a modified particle swarm optimization (PSO) algorithm. Experiments show that the proposed algorithm has higher recognition accuracy than the others: the character recognition accuracy on the training set is 99.95%, and on the test set it reaches 98.87%.

Keywords: Support Vector Machine, Particle Swarm Optimization Algorithm, Parameter Optimization, Character Recognition


1. Introduction

The automatic license plate recognition system is one of the key technologies of intelligent transportation systems. A license plate recognition system mainly includes three parts: license plate location, character segmentation, and character recognition. Among them, character recognition is the key part, and its accuracy directly affects the performance of the whole system. At present, the commonly used methods for license plate character recognition are mainly based on template matching [1-2], neural networks [2-3], and so on. Template matching works on the character as a whole: the test character is matched against standard template characters, and their correlation is computed for identification. This method is fast and accurate, but license plate images collected in natural environments often suffer from rotation, fading, deformation, and similar distortions, which seriously degrade the recognition result. Neural networks have strong curve-fitting and pattern-classification abilities, which has made them widely used in character recognition, but the algorithm converges slowly and the result easily falls into a local minimum, among other shortcomings.

Because the SVM (Support Vector Machine) performs well for small-sample, nonlinear, high-dimensional pattern recognition, it has been used more and more widely in character recognition. However, different parameters affect the performance of the SVM model. Zheng et al. [4-5] analyzed the influence of the kernel function parameter and the penalty factor on SVM performance, but did not give a specific selection method. Currently the most common parameter optimization methods are empirical selection, grid search [6-7], genetic algorithms [8-9], ant colony algorithms [10-11], particle swarm optimization [12-13], and other swarm intelligence optimization algorithms. This paper presents an improved PSO-SVM (Particle Swarm Optimization-Support Vector Machine) algorithm. First, we use PSO to select the parameters C and g of the SVM; then we build an SVM classification model to recognize license plate characters. Experimental results show that our algorithm has higher accuracy than SVM, Grid-SVM, and GA-SVM.

2. SVM and PSO

2.1. SVM

SVM is a machine learning method based on statistical learning theory, proposed by Vapnik in 1995. It is based on minimizing the sum of the empirical risk and the confidence risk, also called the structural risk minimization principle. The goal is to find a hyperplane that classifies the two kinds of sample points accurately while maximizing their minimum distance to the hyperplane. Suppose the training set is $T = \{(x_1, y_1), \dots, (x_l, y_l)\}$, where $y_i \in \{-1, +1\}$ is the class label and $x_i \in \mathbb{R}^n$ is the feature vector.

The Lagrange multiplier method and a kernel function are used to find the separating hyperplane; the problem then becomes:

$$\min_{\alpha}\ \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l} y_i y_j \alpha_i \alpha_j K(x_i, x_j) - \sum_{j=1}^{l}\alpha_j \quad \text{s.t.}\quad \sum_{i=1}^{l} y_i \alpha_i = 0,\ 0 \le \alpha_i \le C \tag{1}$$

Where $\alpha_i$ is the Lagrange multiplier and $K(x_i, x_j)$ is the kernel function. The optimal solution is $\alpha^* = (\alpha_1^*, \alpha_2^*, \dots, \alpha_l^*)^T$.

Select a positive component $\alpha_j^*$ of $\alpha^*$ located in the open interval $(0, C)$, and calculate the classification threshold $b^*$ according to equation (2):

$$b^* = y_j - \sum_{i=1}^{l} y_i \alpha_i^* K(x_i, x_j) \tag{2}$$

Construct the decision function as in equation (3):

$$f(x) = \operatorname{sgn}\left(\sum_{i=1}^{l} \alpha_i^* y_i K(x, x_i) + b^*\right) \tag{3}$$

The above algorithm can only solve two-class problems. For multi-class problems, the SVM uses the one-versus-one strategy. Combined with $k$-fold cross-validation, the data are divided into $k$ groups; each group serves as the test set in turn while the rest form the training set. This yields $k$ models, and the average classification accuracy of these $k$ models is taken as the performance of the classifier.
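The $k$-fold splitting described above can be sketched in plain Python; the function name and the stride-based fold layout are illustrative assumptions, not details from the paper:

```python
# Minimal sketch of k-fold cross-validation index splitting for a
# dataset indexed 0..n-1. Each fold is used as the test set once
# while the remaining folds form the training set.
def k_fold_indices(n, k):
    folds = [list(range(i, n, k)) for i in range(k)]
    splits = []
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        splits.append((train, test))
    return splits

splits = k_fold_indices(10, 5)
# 5 (train, test) splits; every index appears in exactly one test fold
```

Averaging the accuracy of the model trained on each `train` part and evaluated on the matching `test` part gives the classifier's performance estimate used later as the PSO fitness.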

There are many kinds of kernel functions for support vector machines: the linear kernel, the polynomial kernel, the radial basis function (RBF) kernel, the sigmoid kernel, etc.
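As a rough illustration, the kernels listed above can be written as follows; the default parameter values (polynomial degree, RBF width, sigmoid coefficients) are assumptions for the sketch, not values from the paper:

```python
import numpy as np

# Sketches of the four common SVM kernels named in the text.
def linear_kernel(x, z):
    return float(np.dot(x, z))

def poly_kernel(x, z, d=3, c=1.0):
    return float((np.dot(x, z) + c) ** d)

def rbf_kernel(x, z, g=0.5):
    # exp(-g * ||x - z||^2); equals 1 when x == z
    return float(np.exp(-g * np.sum((x - z) ** 2)))

def sigmoid_kernel(x, z, a=0.01, c=0.0):
    return float(np.tanh(a * np.dot(x, z) + c))

x = np.array([1.0, 2.0])
# rbf_kernel(x, x) == 1.0 for any g, since ||x - x|| = 0
```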

2.2. Standard PSO Optimization Algorithm

The PSO algorithm derives from the study of birds' predatory behavior. The basic idea is to find the optimal solution through collaboration and information sharing among the individuals in the group.

The basic PSO algorithm is described as follows. Suppose the target search space is $D$-dimensional and the whole group consists of $M$ particles. The current position of particle $i$ is $X_i = (x_{i1}, x_{i2}, \dots, x_{iD})$, its flight velocity is $V_i = (v_{i1}, v_{i2}, \dots, v_{iD})$, the best position particle $i$ has searched is $P_i = (p_{i1}, p_{i2}, \dots, p_{iD})$, and the best position the entire population has searched is $P_g = (p_{g1}, p_{g2}, \dots, p_{gD})$. In every iteration, the particles update their velocity and position through the individual extremum and the global extremum. The update formulas are shown in (4) and (5).

$$v_{id}^{k+1} = w\, v_{id}^{k} + c_1 r_1 \left(p_{id} - x_{id}^{k}\right) + c_2 r_2 \left(p_{gd} - x_{id}^{k}\right) \tag{4}$$

$$x_{id}^{k+1} = x_{id}^{k} + v_{id}^{k+1} \tag{5}$$

Here $v_{id}$ is the velocity of the particle; $w$ is the inertia weight; $c_1$ and $c_2$ are learning factors, which adjust the maximum step of the particle's flight toward its own best position and the group's best position, usually $c_1 = c_2 = 2$; $r_1$ and $r_2$ are random numbers uniformly distributed in $[0, 1]$.
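A minimal sketch of the standard PSO of equations (4)-(5) follows, minimizing the 2-D sphere function as a stand-in objective; the objective, bounds, and the constriction-style coefficients ($w = 0.729$, $c_1 = c_2 = 1.49$, chosen here for stable convergence in a short toy run) are assumptions for illustration:

```python
import numpy as np

# Standard PSO per equations (4)-(5): velocities pulled toward each
# particle's own best (pbest) and the swarm's best (gbest).
def pso(fitness, dim=2, m=20, iters=100, w=0.729, c1=1.49, c2=1.49, bound=5.0):
    rng = np.random.default_rng(0)
    x = rng.uniform(-bound, bound, (m, dim))   # positions X_i
    v = np.zeros((m, dim))                     # velocities V_i
    pbest = x.copy()                           # individual extrema P_i
    pval = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmin(pval)].copy()      # global extremum P_g
    for _ in range(iters):
        r1, r2 = rng.random((m, dim)), rng.random((m, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # eq. (4)
        x = np.clip(x + v, -bound, bound)                          # eq. (5)
        f = np.array([fitness(p) for p in x])
        better = f < pval
        pbest[better], pval[better] = x[better], f[better]
        gbest = pbest[np.argmin(pval)].copy()
    return gbest, float(pval.min())

best, best_val = pso(lambda p: float(np.sum(p ** 2)))
# the swarm converges near the sphere minimum at the origin
```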

3. The Improved PSO Optimization Algorithm

3.1. Selection of Inertia Weight

The inertia weight reflects the extent to which the current velocity of a particle inherits its previous velocity: a larger inertia weight favors global search, while a smaller one is more conducive to local search. To better balance the global and local search abilities of the algorithm, we use the LDIW (Linear Decreasing Inertia Weight) strategy:

$$w = w_{\max} - \left(w_{\max} - w_{\min}\right)\frac{t}{t_{\max}} \tag{6}$$

Where $w_{\max}$ is the initial inertia weight, $w_{\min}$ is the inertia weight at the maximum iteration, $t$ is the current iteration number, and $t_{\max}$ is the maximum number of iterations.
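Equation (6) reduces to a one-line schedule; the endpoints $w_{\max} = 0.9$ and $w_{\min} = 0.4$ below are commonly used values and an assumption here, not ones stated in the paper:

```python
# Linearly decreasing inertia weight (LDIW) per equation (6).
def ldiw(t, t_max, w_max=0.9, w_min=0.4):
    return w_max - (w_max - w_min) * t / t_max

ws = [ldiw(t, 200) for t in range(201)]
# starts at w_max, ends at w_min, decreasing linearly in between
```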

3.2. Adaptive Mutation

The PSO algorithm suffers from premature convergence, low precision, and low efficiency in late iterations. Borrowing the mutation idea from genetic algorithms, we add mutation to PSO to improve the diversity of the population.

The mutation operator introduced into PSO works as follows: after each update, each particle is re-initialized with a certain probability. The mutation operator expands the search space that keeps narrowing during iteration, lets particles escape previously searched best positions into other regions, and improves the possibility that the algorithm finds a better optimum.
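The mutation step can be sketched as below; the mutation probability `pm = 0.05` and the function name are assumptions for illustration, as the paper does not state a value:

```python
import random

# Mutation operator: after each update, re-initialize a particle's
# position within the search bounds with probability pm.
def mutate(position, low, high, pm=0.05, rng=random):
    if rng.random() < pm:
        return [rng.uniform(low, high) for _ in position]
    return position

p = [1.0, 2.0]
# with pm = 0.0 the particle is unchanged; with pm = 1.0 it is re-initialized
```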

4. SVM Parameter Optimization Based on Improved PSO

4.1. SVM Parameters C and g

Studies have shown that the RBF kernel is a commonly used kernel function [14], and it tends to perform well in a variety of pattern recognition tasks. In this paper, this kernel is used for the experiments; it is shown in equation (7).

$$K(x, x_i) = \exp\left(-g \left\| x - x_i \right\|^2\right) \tag{7}$$

Where $g$ is a tunable parameter that directly affects the classifier's recognition performance.

The penalty factor $C$ is also an important parameter affecting the performance of the classifier. It indicates the degree of punishment for misclassified samples: a larger $C$ value emphasizes a smaller training error, while a smaller $C$ value emphasizes a larger margin. The aim is to find a classifier with better generalization ability.

4.2. Select the Fitness Function

The quality of the fitness function is a key measure in PSO. In PSO-SVM, each particle represents a set of parameters, and the fitness value of a particle is the performance of the algorithm under that set of parameters.

This paper selects the accuracy in the CV sense as the fitness function in PSO. CV (cross-validation) is a statistical analysis method used to verify the performance of a classifier: the original data are divided into two parts, a training set and a test set; the classifier is trained on the training set, the resulting model is tested on the test set, and the resulting classification accuracy serves as the performance indicator of the classifier.
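The CV-sense fitness can be sketched as the $k$-fold accuracy of a classifier under a fixed parameter setting. To keep the sketch self-contained, a nearest-centroid classifier stands in for the trained SVM (an assumption; in the paper the classifier is the RBF-SVM with parameters $C$ and $g$), and the toy data are made up:

```python
import numpy as np

# k-fold CV accuracy of a stand-in classifier (nearest centroid).
def cv_accuracy(X, y, k=5):
    n = len(y)
    folds = [list(range(i, n, k)) for i in range(k)]
    correct = 0
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        # "train": per-class centroids computed on the training folds
        centroids = {c: X[[j for j in train if y[j] == c]].mean(axis=0)
                     for c in set(y[train])}
        # "test": assign each held-out sample to the nearest centroid
        for j in test:
            pred = min(centroids,
                       key=lambda c: float(np.sum((X[j] - centroids[c]) ** 2)))
            correct += int(pred == y[j])
    return correct / n

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
acc = cv_accuracy(X, y)   # two well-separated classes, so accuracy is high
```

In PSO-SVM, this accuracy (computed with the real SVM) is exactly the fitness value assigned to the particle holding that parameter set.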

4.3. PSO-SVM Algorithm

The process of optimizing the SVM parameters by using PSO is shown in Figure 1.

Figure 1. The process of optimizing SVM parameters by using PSO.
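The overall loop of Figure 1 can be sketched end to end: the improved PSO (LDIW schedule plus mutation) searches over $(C, g)$ pairs and returns the best pair found. The fitness below is a smooth surrogate with a peak near $C = 10$, $g = 0.1$; it is a made-up stand-in for the SVM cross-validation accuracy, and the bounds, coefficients, and mutation probability are likewise assumptions:

```python
import numpy as np

# Made-up surrogate for the CV-accuracy fitness (peak near C=10, g=0.1).
def surrogate_fitness(p):
    c, g = p
    return -((np.log10(c) - 1.0) ** 2 + (np.log10(g) + 1.0) ** 2)

# Improved PSO: LDIW inertia (eq. 6) plus per-particle mutation,
# maximizing the fitness over 2-D particles (C, g).
def improved_pso(fitness, low, high, m=20, iters=100,
                 w_max=0.9, w_min=0.4, c1=1.49, c2=1.49, pm=0.05):
    rng = np.random.default_rng(1)
    low, high = np.asarray(low, float), np.asarray(high, float)
    x = rng.uniform(low, high, (m, 2))           # particles = (C, g) pairs
    v = np.zeros((m, 2))
    pbest = x.copy()
    pval = np.array([fitness(p) for p in x])
    gbest = pbest[np.argmax(pval)].copy()
    for t in range(iters):
        w = w_max - (w_max - w_min) * t / iters  # LDIW schedule, eq. (6)
        r1, r2 = rng.random((m, 2)), rng.random((m, 2))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, low, high)
        mut = rng.random(m) < pm                 # mutation: re-initialize
        x[mut] = rng.uniform(low, high, (int(mut.sum()), 2))
        f = np.array([fitness(p) for p in x])
        better = f > pval
        pbest[better], pval[better] = x[better], f[better]
        gbest = pbest[np.argmax(pval)].copy()
    return gbest                                 # best (C, g) found

C_best, g_best = improved_pso(surrogate_fitness, [0.1, 0.001], [100.0, 10.0])
```

Replacing `surrogate_fitness` with the k-fold CV accuracy of an RBF-SVM trained at each candidate $(C, g)$ gives the full PSO-SVM procedure of this section.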

5. The Simulation

5.1. Data sets and Parameter Settings

The license plate characters are usually drawn from 31 Chinese characters, 25 capital English letters, and the 10 digits 0-9. We collect 600 license plate character images as the training set and 200 license plate character images as the test set. Each image is 32×16 pixels. The experimental hardware environment is: CPU clock 3.30 GHz, memory size 8.00 GB.

The SVM kernel function is shown in equation (7). The PSO population size is 20 and the maximum number of iterations is 200; the learning factors $c_1$ and $c_2$ and the inertia weight $w$ are set as described in Sections 2.2 and 3.1, and search ranges are set for the penalty factor $C$ and the kernel function parameter $g$.

5.2. Experimental Results

For comparison, we optimize the SVM parameters using the standard PSO and the improved PSO, respectively. The optimization results are shown in Figures 2 and 3, which illustrate that the improved PSO performs better than the standard PSO.

Figure 2. PSO-SVM optimization.

Figure 3. Improved PSO-SVM optimization.

The improved PSO-SVM classification model is obtained by training on the training set, and its recognition accuracy is then tested on the training set and the test set, respectively. We also test the recognition accuracy of the other methods. The results are shown in Table 1, from which it can be seen that our improved PSO-SVM method achieves better recognition accuracy than the other methods on both the training set and the test set.

Table 1. Comparison of recognition accuracy.

Method            | Number recognition   | Letter recognition   | Chinese recognition
                  | Training | Test      | Training | Test      | Training | Test
SVM               | 95%      | 92%       | 93%      | 85%       | 88%      | 80%
Grid-SVM          | 92%      | 83%       | 92%      | 89%       | 90%      | 83%
GA-SVM            | 98%      | 94%       | 95%      | 92%       | 94%      | 79%
Improved PSO-SVM  | 100%     | 98%       | 99%      | 97%       | 98%      | 95%

6. Summary

This paper puts forward an improved PSO-SVM algorithm to recognize the characters on license plates. First, the improved PSO is used to optimize the SVM parameters; then the obtained improved PSO-SVM model is used to recognize the characters. Experimental results show that our method has higher recognition accuracy than the other methods.

Acknowledgment: This paper is supported by the Innovation Team of China West Normal University (CXTD2014-4).


References

  1. Wu Wei, Xinhan Huang, Qi-sen Zhang, et al. License plate character recognition based on template matching and neural network. Pattern Recognition and Artificial Intelligence, 2001, 13(01): 123-127.
  2. Min Wang, Xinhan Huang, Wu Wei, et al. License plate character recognition method for template matching and neural network. Journal of Huazhong University of Science and Technology (Natural Science Edition), 2001, 29(03): 48-50.
  3. Rongyan Guo, Xuehui Hu. Application of BP neural network in vehicle license plate character recognition. Computer Simulation, 2010, 27(09): 299-301+350.
  4. Xiaoxia Zheng, Feng Qian. Gaussian kernel SVM and model parameter selection. Computer Engineering and Applications, 2006, 42(1): 77-79.
  5. Xiaoyun Zhang, Yuncai Liu. Performance analysis of Gaussian kernel support vector machine. Computer Engineering, 2003, 29(8): 22-25.
  6. Chao Guo, Weihua Song, Wei Wei. Stope roof stability prediction based on both SVM and grid-search method. China Safety Science Journal, 2014, 24(8): 31-36.
  7. Qingyi Li, Hao Zhou, Aping Lin, et al. Prediction of ash fusion temperature based on grid search and support vector machine. Journal of Zhejiang University (Engineering Science), 2011, 45(12): 2181-2187.
  8. Wu CH, Tzeng GH, Goo YJ, et al. A real-valued genetic algorithm to optimize the parameters of support vector machine for predicting bankruptcy. Expert Systems with Applications, 2007, 32(2): 397-408.
  9. Min SH, Lee J, Han I. Hybrid genetic algorithms and support vector machines for bankruptcy prediction. Expert Systems with Applications, 2006, 31(3): 652-660.
  10. Yuanliang Pan, Jun Du, Weitao Hu. Air to ground target optimal tracking method by ant colony optimization-based SVM. Journal of Computational Information Systems, 2014, 10(5): 1805-1810.
  11. Al Dulaimi H B A, Ku Mahamud. Solving SVM model selection problem using ACOR and IACOR. WSEAS Transactions on Computers, 2013, 12(9): 1109-2750.
  12. Huang LC, Dun FJ. A distributed PSO-SVM hybrid system with feature selection and parameter optimization. Applied Soft Computing, 2008, 8(4): 1381-1391.
  13. Weisheng Fei, Junming Wang, Binyu Miao, et al. Particle swarm optimization based support vector machine for forecasting dissolved gases content in power transformer oil. Energy Conversion and Management, 2009, 50(6): 1604-1609.
  14. Tao Wu. Kernels' properties, tricks and its applications on obstacle detection. Changsha: College of Mechatronic Engineering and Automation, National University of Defense Technology, 2003: 18-23.
