
Volume 18, Issue 2, 2009

On New RBF Neural Network Construction Algorithm for Classification

Amel SIFAOUI 
U. R. Automatique, ENIT
École Nationale d’Ingénieurs de Tunis (ENIT)
BP 37, Le Belvédère 1002 Tunis, Tunisie

Afef ABDELKRIM
U. R. Automatique, ENIT
École Supérieure de Technologie et d’Informatique (ESTI)

Sonia ALOUANE 
École Nationale d’Ingénieurs de Tunis (ENIT)
BP 37, Le Belvédère 1002 Tunis, Tunisie

Mohamed BENREJEB 
U. R. Automatique, ENIT
École Nationale d’Ingénieurs de Tunis (ENIT)
BP 37, Le Belvédère 1002 Tunis, Tunisie

Abstract: The proposed method for constructing a Radial Basis Function (RBF) neural network classifier is based on a new algorithm for characterizing the hidden layer structure. This algorithm, called HNEM-k-means, groups the training data class by class in order to determine the optimal number of clusters in each class, using new global and local evaluations of the partitions obtained by the k-means algorithm. Two data sets are considered to show the efficiency of the proposed approach, and the obtained results are compared with existing classifiers.

Keywords: Radial Basis Function neural network, classification, k-means, validity indexes.

Amel Sifaoui obtained her Master of Science in Computer Science in 1998 from the Faculty of Sciences of Tunis (FST) and her PhD in Electrical Engineering from the National School of Engineers of Tunis (ENIT) in 2008; she is now a Laboratory Assistant at the Faculty of Sciences of Gafsa. Her research interests are in the artificial neural network domain.

Afef Abdelkrim obtained an engineering diploma in 2000 and a PhD in Electrical Engineering from the National School of Engineers of Tunis (ENIT) in 2005; she is now an Assistant Professor at the High School of Technology and Computer Science (ESTI). Her research interests are in handwriting process modelling and synthesis and in the artificial neural network domain.

Sonia Alouane obtained a Graduate Diploma in Computer Science applied to management in 1994 from the Institut Supérieur de Gestion de Tunis, and a postgraduate diploma (“Technologue”), specialized in the teaching of computer science and technology, from the National School of Computer Science (ENSI) in 2000. She currently teaches as an “Assistante technologue” at the National School of Engineers of Tunis (ENIT). Her research interests are in the artificial neural network domain.

Mohamed Benrejeb obtained the diploma of “Ingénieur IDN” (French “Grande École”) in 1973, the Master degree in Automatic Control in 1974, the PhD in Automatic Control from the University of Lille (USTL) in 1976 and the DSc from the same university in 1980. He is currently a full Professor at the École Nationale d’Ingénieurs de Tunis (Tunisia) and an Invited Professor at the École Centrale de Lille (France). His research interests are in the analysis and synthesis of complex systems based on classical and non-conventional approaches and, more recently, in the discrete event systems domain.

CITE THIS PAPER AS:
Amel SIFAOUI, Afef ABDELKRIM, Sonia ALOUANE, Mohamed BENREJEB, On New RBF Neural Network Construction Algorithm for Classification, Studies in Informatics and Control, ISSN 1220-1766, vol. 18 (2), pp. 103-110, 2009.

1. Introduction

Introduced into the neural network literature by Broomhead and Lowe [1], radial basis function neural networks have been widely used for function approximation, pattern classification and recognition, owing to their structural simplicity and fast learning [2, 3]. However, their design remains a difficult task due to the absence of a systematic method for obtaining an optimal architecture. A network that is too small cannot learn the problem well, while one that is too large leads to over-fitting and poor generalization performance [4].

A very important step in RBF network training is choosing a proper number of hidden neurons (number of basis functions), because it controls the complexity and the generalization ability of the network; many works have therefore proposed RBF classifier design algorithms [5-11].

The radial basis function network is a three-layer feedforward network, typically used for supervised classification. Its training procedure is usually split into two successive steps. First, the centers of the hidden layer neurons are selected by clustering algorithms such as k-means [11, 12], vector quantization [13] or decision trees [14], and the widths are then calculated [15]. Second, the weights connecting the hidden layer to the output layer are determined by Singular Value Decomposition (SVD) or by Least Mean Squares (LMS) algorithms [16].
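As a concrete illustration of the first training step, the sketch below selects the hidden-layer centers with a plain k-means pass and evaluates the Gaussian hidden-layer activations. The maximin initialisation, function names and parameters are implementation choices of this sketch, not part of the paper's algorithm.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means clustering; returns the k cluster centers of X."""
    X = np.asarray(X, float)
    rng = np.random.default_rng(seed)
    # Maximin (farthest-point) initialisation for a stable start.
    centers = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each sample to its nearest center, then recompute the means.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def rbf_activations(X, centers, widths):
    """Gaussian hidden-layer outputs: phi_ij = exp(-||x_i - c_j||^2 / (2 s_j^2))."""
    d2 = ((np.asarray(X, float)[:, None] - centers[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * widths ** 2))
```

Given the centers and widths, `rbf_activations` produces the hidden-layer activation matrix on which the second (output-weight) training step operates.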

In this paper, a new learning algorithm is proposed for constructing radial basis function networks that solve classification problems. It determines the proper number of hidden neurons and calculates the center values of the radial basis functions. After the centers of the hidden neurons are selected, the widths of the nodes are determined by the P-nearest neighbour heuristic, and the weights between the hidden layer and the output layer are calculated using the pseudo-inverse matrix.
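The two quantities just mentioned can be sketched as follows: a P-nearest-neighbour width heuristic (here, the RMS distance from each center to its P closest fellow centers, one common variant of the heuristic) and output weights obtained through the Moore-Penrose pseudo-inverse of the hidden-layer activation matrix. The function names and the default P = 2 are assumptions of this sketch, not values taken from the paper.

```python
import numpy as np

def pnn_widths(centers, P=2):
    """P-nearest-neighbour heuristic: each width is the RMS distance from
    a center to its P closest other centers (one common variant)."""
    centers = np.asarray(centers, float)
    P = min(P, len(centers) - 1)      # guard against too few centers
    D = np.linalg.norm(centers[:, None] - centers[None], axis=-1)
    np.fill_diagonal(D, np.inf)       # ignore each center's self-distance
    nearest = np.sort(D, axis=1)[:, :P]
    return np.sqrt((nearest ** 2).mean(axis=1))

def output_weights(Phi, Y):
    """Least-squares hidden-to-output weights via the pseudo-inverse:
    W = pinv(Phi) @ Y, with Phi the hidden-layer activation matrix."""
    return np.linalg.pinv(Phi) @ Y
```

Because `np.linalg.pinv` returns the minimum-norm least-squares solution, the same call covers both the exactly determined and the overdetermined case.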

The proposed approach combines new evaluation measures with the k-means algorithm, leading to a new algorithm, called HNEM-k-means, that automatically determines the number of clusters in the data of each class. Two different real databases are used to evaluate the classifier performance.
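The HNEM evaluation measures themselves appear in the full text rather than in this introduction, but the class-by-class search can be pictured as follows. The sketch below runs k-means separately on each class, starting from a maximum number of groups and pruning downward, and keeps the cluster count with the best partition score; the score used here (mean within-cluster spread over the smallest between-center distance) is a generic stand-in, not the paper's global and local evaluations.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means with maximin initialisation; returns the cluster centers."""
    X = np.asarray(X, float)
    rng = np.random.default_rng(seed)
    centers = [X[rng.integers(len(X))]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def partition_score(X, centers):
    """Stand-in validity score (lower is better), defined for k >= 2:
    mean within-cluster spread / smallest distance between two centers."""
    labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    within = np.mean([np.linalg.norm(X[labels == j] - centers[j], axis=1).mean()
                      for j in range(len(centers)) if np.any(labels == j)])
    D = np.linalg.norm(centers[:, None] - centers[None], axis=-1)
    np.fill_diagonal(D, np.inf)
    return within / D.min()

def clusters_per_class(X, y, k_max=5):
    """Group the data class by class; for each class, start from k_max
    groups and prune downward, keeping the best-scoring cluster count."""
    X, y = np.asarray(X, float), np.asarray(y)
    result = {}
    for c in np.unique(y):
        Xc = X[y == c]
        best_k, best_s = 1, np.inf   # fall back to one group by default
        for k in range(min(k_max, len(Xc)), 1, -1):
            s = partition_score(Xc, kmeans(Xc, k))
            if s < best_s:
                best_k, best_s = k, s
        result[c] = best_k
    return result
```

The selected per-class counts then fix the number of hidden neurons, one Gaussian unit per retained cluster center.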

6. Conclusion

In this paper, a new algorithm is proposed to design RBF neural network classifiers and, in particular, to select the centers of the hidden layer neurons.

Based on a pruning technique, it constructs the hidden layer of an RBF neural network by starting with the maximum number of groups, which is decreased over the successive iterations of the algorithm.

The basic idea of this approach is to gather the training data class by class and to determine the optimal number of groups in each class by using the proposed validity index measures, which are integrated into the k-means algorithm.

The classifier results obtained on two real databases are satisfactory in comparison with other classifiers considered in the literature.

REFERENCES

  1. Broomhead, D. S., Lowe D., Multivariable functional interpolation and adaptive networks, Complex Syst., vol. 2, 1988, pp. 321-355.
  2. Zhang, G.P., Neural Networks for Classification: A Survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 30, no. 4, 2000.
  3. Verleysen, M., Hlavackava K., An optimized RBF network for approximation of functions, European Symposium on Artificial Neural Networks, Brussels, 1994, pp. 175-180.
  4. Kwok, T.Y., Yeung D.Y., Constructive algorithms for structure learning in feed forward neural networks for regression problems, IEEE Trans. on Neural Networks, vol. 8, no. 3, 1997, pp. 630-645.
  5. Sing, J.K., Basu D.K., Nasipuri M., Kundu M., Improved K-means Algorithm in the Design of RBF Neural Networks, IEEE Conf. on Convergent Technologies for the Asia-Pacific, vol. 2, Tencon 2003, Bangalore, pp. 841-845.
  6. Karayiannis N.B., Randolph-Gips, M.M., On the Construction and Training of Reformulated Radial Basis Function Neural Networks, IEEE Trans. on Neural Networks, vol. 14, no. 4, 2003, pp. 835-845.
  7. Mao, K.Z., Huang G.B., Neuron selection for RBF neural network classifier based on data structure preserving criterion, IEEE Trans. on Neural Networks, vol. 16, Issue: 6, 2005, pp. 1531-1540.
  8. Yu, B., He X., Training Radial Basis Function Networks with Differential Evolution, Proceedings of World Academy of Science Engineering and Technology, vol. 11, 2006, Helsinki, Finland, pp. 157-160.
  9. Belloir, F., Fache A., Billat A., A General Approach to Construct RBF Net-Based Classifier, European Symposium on Artificial Neural Networks, Bruges, 1999, pp. 399-404.
  10. Saratchandran, P., Sundararajan N., A generalized growing and pruning RBF (GGAP-RBF) neural network for function approximation, IEEE Trans. on Neural Networks, vol. 16, Issue 1, 2005, pp. 57-67.
  11. Moody, J., Darken C.J., Fast learning in networks of locally-tuned processing units, Neural Computation, 1989, pp. 281-294.
  12. Xu, R., Wunsch D., Survey of Clustering Algorithms, IEEE Trans. on Neural Networks, vol. 16, no. 3, 2005, pp. 645-678.
  13. Vogt, M., Combination of Radial Basis Function Neural Networks with Optimized Learning Vector Quantization, IEEE Int. Conf. on Neural Networks, San Francisco, 1993, pp. 1841-1846.
  14. Kubat, M., Decision Trees Can Initialize Radial-Basis Function Networks, IEEE Trans. on Neural Networks, vol. 9, Issue 5, 1998, pp. 813-821.
  15. Benoudjit, N., Archambeau C., Lendasse A., Lee J., Verleysen M., Width optimization of the Gaussian kernels in Radial Basis Function Networks, European Symposium on Artificial Neural Networks, Bruges, 2002, pp. 425-432.
  16. Borne, P., Benrejeb M., Haggège J., Les réseaux de neurones: Présentation et application, Ed. Technip, Paris, 2007.
  17. Shen, M.J., Chang S.I., Lee E.S., Deng Y., Brown S.J., Determination of cluster number in clustering microarray data, Applied Mathematics and Computation, vol. 169, Issue 2, 2005, pp. 1172-1185.
  18. Halkidi, M., Batistakis Y., Vazirgiannis M., Clustering Algorithms and Validity, 13th Int. Conf. on Scientific and Statistical Database Management, 2001, Virginia, pp. 3-22.
  19. Halkidi, M., Batistakis Y., Vazirgiannis M., On Clustering Validation Techniques, J. of Intelligent Info. Syst., vol. 17, 2001, pp. 107-145.
  20. Sassi, M., Grissa-Touzi A., Ounelli H., Using Gaussians Functions to Determine Representative Clustering Prototypes, 17th IEEE Int. Conf. on Database and Expert Systems Applications, 2006, pp. 435-439.
  21. Bezdek, J.C., Convergence Theory for Fuzzy C-Means: Counter Examples and Repairs, IEEE Trans. on Systems, 1987, pp. 873-877.
  22. UCI Machine Learning Repository, http://www.ics.uci.edu/learn/MLRepository.html.
  23. Hwang, Y.S., Bang S.Y., An Efficient Method to Construct a Radial Basis Function Network Classifier, Neural Networks, vol. 10, no. 8, 1997, pp. 1495-1503.